Jan 26 09:00:41 localhost kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 26 09:00:41 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 26 09:00:41 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 26 09:00:41 localhost kernel: BIOS-provided physical RAM map:
Jan 26 09:00:41 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 26 09:00:41 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 26 09:00:41 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 26 09:00:41 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 26 09:00:41 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 26 09:00:41 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 26 09:00:41 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 26 09:00:41 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 26 09:00:41 localhost kernel: NX (Execute Disable) protection: active
Jan 26 09:00:41 localhost kernel: APIC: Static calls initialized
Jan 26 09:00:41 localhost kernel: SMBIOS 2.8 present.
Jan 26 09:00:41 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 26 09:00:41 localhost kernel: Hypervisor detected: KVM
Jan 26 09:00:41 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 26 09:00:41 localhost kernel: kvm-clock: using sched offset of 3356242937 cycles
Jan 26 09:00:41 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 26 09:00:41 localhost kernel: tsc: Detected 2799.998 MHz processor
Jan 26 09:00:41 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 26 09:00:41 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 26 09:00:41 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 26 09:00:41 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 26 09:00:41 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 26 09:00:41 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 26 09:00:41 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 26 09:00:41 localhost kernel: Using GB pages for direct mapping
Jan 26 09:00:41 localhost kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 26 09:00:41 localhost kernel: ACPI: Early table checksum verification disabled
Jan 26 09:00:41 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 26 09:00:41 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 09:00:41 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 09:00:41 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 09:00:41 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 26 09:00:41 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 09:00:41 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 09:00:41 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 26 09:00:41 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 26 09:00:41 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 26 09:00:41 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 26 09:00:41 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 26 09:00:41 localhost kernel: No NUMA configuration found
Jan 26 09:00:41 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 26 09:00:41 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Jan 26 09:00:41 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 26 09:00:41 localhost kernel: Zone ranges:
Jan 26 09:00:41 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 26 09:00:41 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 26 09:00:41 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 26 09:00:41 localhost kernel:   Device   empty
Jan 26 09:00:41 localhost kernel: Movable zone start for each node
Jan 26 09:00:41 localhost kernel: Early memory node ranges
Jan 26 09:00:41 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 26 09:00:41 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 26 09:00:41 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 26 09:00:41 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 26 09:00:41 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 26 09:00:41 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 26 09:00:41 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 26 09:00:41 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Jan 26 09:00:41 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 26 09:00:41 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 26 09:00:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 26 09:00:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 26 09:00:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 26 09:00:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 26 09:00:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 26 09:00:41 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 26 09:00:41 localhost kernel: TSC deadline timer available
Jan 26 09:00:41 localhost kernel: CPU topo: Max. logical packages:   8
Jan 26 09:00:41 localhost kernel: CPU topo: Max. logical dies:       8
Jan 26 09:00:41 localhost kernel: CPU topo: Max. dies per package:   1
Jan 26 09:00:41 localhost kernel: CPU topo: Max. threads per core:   1
Jan 26 09:00:41 localhost kernel: CPU topo: Num. cores per package:     1
Jan 26 09:00:41 localhost kernel: CPU topo: Num. threads per package:   1
Jan 26 09:00:41 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 26 09:00:41 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 26 09:00:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 26 09:00:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 26 09:00:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 26 09:00:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 26 09:00:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 26 09:00:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 26 09:00:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 26 09:00:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 26 09:00:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 26 09:00:41 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 26 09:00:41 localhost kernel: Booting paravirtualized kernel on KVM
Jan 26 09:00:41 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 26 09:00:41 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 26 09:00:41 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 26 09:00:41 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Jan 26 09:00:41 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Jan 26 09:00:41 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 26 09:00:41 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 26 09:00:41 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 26 09:00:41 localhost kernel: random: crng init done
Jan 26 09:00:41 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 26 09:00:41 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 26 09:00:41 localhost kernel: Fallback order for Node 0: 0 
Jan 26 09:00:41 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 26 09:00:41 localhost kernel: Policy zone: Normal
Jan 26 09:00:41 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 26 09:00:41 localhost kernel: software IO TLB: area num 8.
Jan 26 09:00:41 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 26 09:00:41 localhost kernel: ftrace: allocating 49417 entries in 194 pages
Jan 26 09:00:41 localhost kernel: ftrace: allocated 194 pages with 3 groups
Jan 26 09:00:41 localhost kernel: Dynamic Preempt: voluntary
Jan 26 09:00:41 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 26 09:00:41 localhost kernel: rcu:         RCU event tracing is enabled.
Jan 26 09:00:41 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 26 09:00:41 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Jan 26 09:00:41 localhost kernel:         Rude variant of Tasks RCU enabled.
Jan 26 09:00:41 localhost kernel:         Tracing variant of Tasks RCU enabled.
Jan 26 09:00:41 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 26 09:00:41 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 26 09:00:41 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 26 09:00:41 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 26 09:00:41 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 26 09:00:41 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 26 09:00:41 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 26 09:00:41 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 26 09:00:41 localhost kernel: Console: colour VGA+ 80x25
Jan 26 09:00:41 localhost kernel: printk: console [ttyS0] enabled
Jan 26 09:00:41 localhost kernel: ACPI: Core revision 20230331
Jan 26 09:00:41 localhost kernel: APIC: Switch to symmetric I/O mode setup
Jan 26 09:00:41 localhost kernel: x2apic enabled
Jan 26 09:00:41 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Jan 26 09:00:41 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 26 09:00:41 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Jan 26 09:00:41 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 26 09:00:41 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 26 09:00:41 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 26 09:00:41 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 26 09:00:41 localhost kernel: Spectre V2 : Mitigation: Retpolines
Jan 26 09:00:41 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 26 09:00:41 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 26 09:00:41 localhost kernel: RETBleed: Mitigation: untrained return thunk
Jan 26 09:00:41 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 26 09:00:41 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 26 09:00:41 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 26 09:00:41 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 26 09:00:41 localhost kernel: x86/bugs: return thunk changed
Jan 26 09:00:41 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 26 09:00:41 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 26 09:00:41 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 26 09:00:41 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 26 09:00:41 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 26 09:00:41 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 26 09:00:41 localhost kernel: Freeing SMP alternatives memory: 40K
Jan 26 09:00:41 localhost kernel: pid_max: default: 32768 minimum: 301
Jan 26 09:00:41 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 26 09:00:41 localhost kernel: landlock: Up and running.
Jan 26 09:00:41 localhost kernel: Yama: becoming mindful.
Jan 26 09:00:41 localhost kernel: SELinux:  Initializing.
Jan 26 09:00:41 localhost kernel: LSM support for eBPF active
Jan 26 09:00:41 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 26 09:00:41 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 26 09:00:41 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 26 09:00:41 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 26 09:00:41 localhost kernel: ... version:                0
Jan 26 09:00:41 localhost kernel: ... bit width:              48
Jan 26 09:00:41 localhost kernel: ... generic registers:      6
Jan 26 09:00:41 localhost kernel: ... value mask:             0000ffffffffffff
Jan 26 09:00:41 localhost kernel: ... max period:             00007fffffffffff
Jan 26 09:00:41 localhost kernel: ... fixed-purpose events:   0
Jan 26 09:00:41 localhost kernel: ... event mask:             000000000000003f
Jan 26 09:00:41 localhost kernel: signal: max sigframe size: 1776
Jan 26 09:00:41 localhost kernel: rcu: Hierarchical SRCU implementation.
Jan 26 09:00:41 localhost kernel: rcu:         Max phase no-delay instances is 400.
Jan 26 09:00:41 localhost kernel: smp: Bringing up secondary CPUs ...
Jan 26 09:00:41 localhost kernel: smpboot: x86: Booting SMP configuration:
Jan 26 09:00:41 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 26 09:00:41 localhost kernel: smp: Brought up 1 node, 8 CPUs
Jan 26 09:00:41 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Jan 26 09:00:41 localhost kernel: node 0 deferred pages initialised in 10ms
Jan 26 09:00:41 localhost kernel: Memory: 7763956K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618368K reserved, 0K cma-reserved)
Jan 26 09:00:41 localhost kernel: devtmpfs: initialized
Jan 26 09:00:41 localhost kernel: x86/mm: Memory block size: 128MB
Jan 26 09:00:41 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 26 09:00:41 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 26 09:00:41 localhost kernel: pinctrl core: initialized pinctrl subsystem
Jan 26 09:00:41 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 26 09:00:41 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 26 09:00:41 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 26 09:00:41 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 26 09:00:41 localhost kernel: audit: initializing netlink subsys (disabled)
Jan 26 09:00:41 localhost kernel: audit: type=2000 audit(1769418039.770:1): state=initialized audit_enabled=0 res=1
Jan 26 09:00:41 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 26 09:00:41 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 26 09:00:41 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 26 09:00:41 localhost kernel: cpuidle: using governor menu
Jan 26 09:00:41 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 26 09:00:41 localhost kernel: PCI: Using configuration type 1 for base access
Jan 26 09:00:41 localhost kernel: PCI: Using configuration type 1 for extended access
Jan 26 09:00:41 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 26 09:00:41 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 26 09:00:41 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 26 09:00:41 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 26 09:00:41 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 26 09:00:41 localhost kernel: Demotion targets for Node 0: null
Jan 26 09:00:41 localhost kernel: cryptd: max_cpu_qlen set to 1000
Jan 26 09:00:41 localhost kernel: ACPI: Added _OSI(Module Device)
Jan 26 09:00:41 localhost kernel: ACPI: Added _OSI(Processor Device)
Jan 26 09:00:41 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 26 09:00:41 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 26 09:00:41 localhost kernel: ACPI: Interpreter enabled
Jan 26 09:00:41 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 26 09:00:41 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Jan 26 09:00:41 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 26 09:00:41 localhost kernel: PCI: Using E820 reservations for host bridge windows
Jan 26 09:00:41 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 26 09:00:41 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 26 09:00:41 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [3] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [4] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [5] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [6] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [7] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [8] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [9] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [10] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [11] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [12] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [13] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [14] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [15] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [16] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [17] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [18] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [19] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [20] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [21] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [22] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [23] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [24] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [25] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [26] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [27] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [28] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [29] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [30] registered
Jan 26 09:00:41 localhost kernel: acpiphp: Slot [31] registered
Jan 26 09:00:41 localhost kernel: PCI host bridge to bus 0000:00
Jan 26 09:00:41 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 26 09:00:41 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 26 09:00:41 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 26 09:00:41 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 26 09:00:41 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 26 09:00:41 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 26 09:00:41 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 26 09:00:41 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 26 09:00:41 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 26 09:00:41 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 26 09:00:41 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 26 09:00:41 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 26 09:00:41 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 26 09:00:41 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 26 09:00:41 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 26 09:00:41 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 26 09:00:41 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 26 09:00:41 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 26 09:00:41 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 26 09:00:41 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 26 09:00:41 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 26 09:00:41 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 26 09:00:41 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 26 09:00:41 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 26 09:00:41 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 26 09:00:41 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 26 09:00:41 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 26 09:00:41 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 26 09:00:41 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 26 09:00:41 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 26 09:00:41 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 26 09:00:41 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 26 09:00:41 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 26 09:00:41 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 26 09:00:41 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 26 09:00:41 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 26 09:00:41 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 26 09:00:41 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 26 09:00:41 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 26 09:00:41 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 26 09:00:41 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 26 09:00:41 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 26 09:00:41 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 26 09:00:41 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 26 09:00:41 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 26 09:00:41 localhost kernel: iommu: Default domain type: Translated
Jan 26 09:00:41 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 26 09:00:41 localhost kernel: SCSI subsystem initialized
Jan 26 09:00:41 localhost kernel: ACPI: bus type USB registered
Jan 26 09:00:41 localhost kernel: usbcore: registered new interface driver usbfs
Jan 26 09:00:41 localhost kernel: usbcore: registered new interface driver hub
Jan 26 09:00:41 localhost kernel: usbcore: registered new device driver usb
Jan 26 09:00:41 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 26 09:00:41 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 26 09:00:41 localhost kernel: PTP clock support registered
Jan 26 09:00:41 localhost kernel: EDAC MC: Ver: 3.0.0
Jan 26 09:00:41 localhost kernel: NetLabel: Initializing
Jan 26 09:00:41 localhost kernel: NetLabel:  domain hash size = 128
Jan 26 09:00:41 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 26 09:00:41 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Jan 26 09:00:41 localhost kernel: PCI: Using ACPI for IRQ routing
Jan 26 09:00:41 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 26 09:00:41 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jan 26 09:00:41 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Jan 26 09:00:41 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 26 09:00:41 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 26 09:00:41 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 26 09:00:41 localhost kernel: vgaarb: loaded
Jan 26 09:00:41 localhost kernel: clocksource: Switched to clocksource kvm-clock
Jan 26 09:00:41 localhost kernel: VFS: Disk quotas dquot_6.6.0
Jan 26 09:00:41 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 26 09:00:41 localhost kernel: pnp: PnP ACPI init
Jan 26 09:00:41 localhost kernel: pnp 00:03: [dma 2]
Jan 26 09:00:41 localhost kernel: pnp: PnP ACPI: found 5 devices
Jan 26 09:00:41 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 26 09:00:41 localhost kernel: NET: Registered PF_INET protocol family
Jan 26 09:00:41 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 26 09:00:41 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 26 09:00:41 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 26 09:00:41 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 26 09:00:41 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 26 09:00:41 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 26 09:00:41 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 26 09:00:41 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 26 09:00:41 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 26 09:00:41 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 26 09:00:41 localhost kernel: NET: Registered PF_XDP protocol family
Jan 26 09:00:41 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 26 09:00:41 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 26 09:00:41 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 26 09:00:41 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 26 09:00:41 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 26 09:00:41 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 26 09:00:41 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 26 09:00:41 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 26 09:00:41 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 70794 usecs
Jan 26 09:00:41 localhost kernel: PCI: CLS 0 bytes, default 64
Jan 26 09:00:41 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 26 09:00:41 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 26 09:00:41 localhost kernel: ACPI: bus type thunderbolt registered
Jan 26 09:00:41 localhost kernel: Trying to unpack rootfs image as initramfs...
Jan 26 09:00:41 localhost kernel: Initialise system trusted keyrings
Jan 26 09:00:41 localhost kernel: Key type blacklist registered
Jan 26 09:00:41 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 26 09:00:41 localhost kernel: zbud: loaded
Jan 26 09:00:41 localhost kernel: integrity: Platform Keyring initialized
Jan 26 09:00:41 localhost kernel: integrity: Machine keyring initialized
Jan 26 09:00:41 localhost kernel: Freeing initrd memory: 87956K
Jan 26 09:00:41 localhost kernel: NET: Registered PF_ALG protocol family
Jan 26 09:00:41 localhost kernel: xor: automatically using best checksumming function   avx       
Jan 26 09:00:41 localhost kernel: Key type asymmetric registered
Jan 26 09:00:41 localhost kernel: Asymmetric key parser 'x509' registered
Jan 26 09:00:41 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 26 09:00:41 localhost kernel: io scheduler mq-deadline registered
Jan 26 09:00:41 localhost kernel: io scheduler kyber registered
Jan 26 09:00:41 localhost kernel: io scheduler bfq registered
Jan 26 09:00:41 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 26 09:00:41 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 26 09:00:41 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 26 09:00:41 localhost kernel: ACPI: button: Power Button [PWRF]
Jan 26 09:00:41 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 26 09:00:41 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 26 09:00:41 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 26 09:00:41 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 26 09:00:41 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 26 09:00:41 localhost kernel: Non-volatile memory driver v1.3
Jan 26 09:00:41 localhost kernel: rdac: device handler registered
Jan 26 09:00:41 localhost kernel: hp_sw: device handler registered
Jan 26 09:00:41 localhost kernel: emc: device handler registered
Jan 26 09:00:41 localhost kernel: alua: device handler registered
Jan 26 09:00:41 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 26 09:00:41 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 26 09:00:41 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 26 09:00:41 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 26 09:00:41 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 26 09:00:41 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 26 09:00:41 localhost kernel: usb usb1: Product: UHCI Host Controller
Jan 26 09:00:41 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 26 09:00:41 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 26 09:00:41 localhost kernel: hub 1-0:1.0: USB hub found
Jan 26 09:00:41 localhost kernel: hub 1-0:1.0: 2 ports detected
Jan 26 09:00:41 localhost kernel: usbcore: registered new interface driver usbserial_generic
Jan 26 09:00:41 localhost kernel: usbserial: USB Serial support registered for generic
Jan 26 09:00:41 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 26 09:00:41 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 26 09:00:41 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 26 09:00:41 localhost kernel: mousedev: PS/2 mouse device common for all mice
Jan 26 09:00:41 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 26 09:00:41 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 26 09:00:41 localhost kernel: rtc_cmos 00:04: registered as rtc0
Jan 26 09:00:41 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-01-26T09:00:40 UTC (1769418040)
Jan 26 09:00:41 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 26 09:00:41 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 26 09:00:41 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 26 09:00:41 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 26 09:00:41 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 26 09:00:41 localhost kernel: usbcore: registered new interface driver usbhid
Jan 26 09:00:41 localhost kernel: usbhid: USB HID core driver
Jan 26 09:00:41 localhost kernel: drop_monitor: Initializing network drop monitor service
Jan 26 09:00:41 localhost kernel: Initializing XFRM netlink socket
Jan 26 09:00:41 localhost kernel: NET: Registered PF_INET6 protocol family
Jan 26 09:00:41 localhost kernel: Segment Routing with IPv6
Jan 26 09:00:41 localhost kernel: NET: Registered PF_PACKET protocol family
Jan 26 09:00:41 localhost kernel: mpls_gso: MPLS GSO support
Jan 26 09:00:41 localhost kernel: IPI shorthand broadcast: enabled
Jan 26 09:00:41 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Jan 26 09:00:41 localhost kernel: AES CTR mode by8 optimization enabled
Jan 26 09:00:41 localhost kernel: sched_clock: Marking stable (1190004076, 156305756)->(1459592791, -113282959)
Jan 26 09:00:41 localhost kernel: registered taskstats version 1
Jan 26 09:00:41 localhost kernel: Loading compiled-in X.509 certificates
Jan 26 09:00:41 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 26 09:00:41 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 26 09:00:41 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 26 09:00:41 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 26 09:00:41 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 26 09:00:41 localhost kernel: Demotion targets for Node 0: null
Jan 26 09:00:41 localhost kernel: page_owner is disabled
Jan 26 09:00:41 localhost kernel: Key type .fscrypt registered
Jan 26 09:00:41 localhost kernel: Key type fscrypt-provisioning registered
Jan 26 09:00:41 localhost kernel: Key type big_key registered
Jan 26 09:00:41 localhost kernel: Key type encrypted registered
Jan 26 09:00:41 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 26 09:00:41 localhost kernel: Loading compiled-in module X.509 certificates
Jan 26 09:00:41 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 26 09:00:41 localhost kernel: ima: Allocated hash algorithm: sha256
Jan 26 09:00:41 localhost kernel: ima: No architecture policies found
Jan 26 09:00:41 localhost kernel: evm: Initialising EVM extended attributes:
Jan 26 09:00:41 localhost kernel: evm: security.selinux
Jan 26 09:00:41 localhost kernel: evm: security.SMACK64 (disabled)
Jan 26 09:00:41 localhost kernel: evm: security.SMACK64EXEC (disabled)
Jan 26 09:00:41 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 26 09:00:41 localhost kernel: evm: security.SMACK64MMAP (disabled)
Jan 26 09:00:41 localhost kernel: evm: security.apparmor (disabled)
Jan 26 09:00:41 localhost kernel: evm: security.ima
Jan 26 09:00:41 localhost kernel: evm: security.capability
Jan 26 09:00:41 localhost kernel: evm: HMAC attrs: 0x1
Jan 26 09:00:41 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 26 09:00:41 localhost kernel: Running certificate verification RSA selftest
Jan 26 09:00:41 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 26 09:00:41 localhost kernel: Running certificate verification ECDSA selftest
Jan 26 09:00:41 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 26 09:00:41 localhost kernel: clk: Disabling unused clocks
Jan 26 09:00:41 localhost kernel: Freeing unused decrypted memory: 2028K
Jan 26 09:00:41 localhost kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 26 09:00:41 localhost kernel: Write protecting the kernel read-only data: 30720k
Jan 26 09:00:41 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 26 09:00:41 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 26 09:00:41 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 26 09:00:41 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Jan 26 09:00:41 localhost kernel: usb 1-1: Manufacturer: QEMU
Jan 26 09:00:41 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 26 09:00:41 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 26 09:00:41 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 26 09:00:41 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 26 09:00:41 localhost kernel: Run /init as init process
Jan 26 09:00:41 localhost kernel:   with arguments:
Jan 26 09:00:41 localhost kernel:     /init
Jan 26 09:00:41 localhost kernel:   with environment:
Jan 26 09:00:41 localhost kernel:     HOME=/
Jan 26 09:00:41 localhost kernel:     TERM=linux
Jan 26 09:00:41 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64
Jan 26 09:00:41 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 26 09:00:41 localhost systemd[1]: Detected virtualization kvm.
Jan 26 09:00:41 localhost systemd[1]: Detected architecture x86-64.
Jan 26 09:00:41 localhost systemd[1]: Running in initrd.
Jan 26 09:00:41 localhost systemd[1]: No hostname configured, using default hostname.
Jan 26 09:00:41 localhost systemd[1]: Hostname set to <localhost>.
Jan 26 09:00:41 localhost systemd[1]: Initializing machine ID from VM UUID.
Jan 26 09:00:41 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Jan 26 09:00:41 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 26 09:00:41 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 26 09:00:41 localhost systemd[1]: Reached target Initrd /usr File System.
Jan 26 09:00:41 localhost systemd[1]: Reached target Local File Systems.
Jan 26 09:00:41 localhost systemd[1]: Reached target Path Units.
Jan 26 09:00:41 localhost systemd[1]: Reached target Slice Units.
Jan 26 09:00:41 localhost systemd[1]: Reached target Swaps.
Jan 26 09:00:41 localhost systemd[1]: Reached target Timer Units.
Jan 26 09:00:41 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 26 09:00:41 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Jan 26 09:00:41 localhost systemd[1]: Listening on Journal Socket.
Jan 26 09:00:41 localhost systemd[1]: Listening on udev Control Socket.
Jan 26 09:00:41 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 26 09:00:41 localhost systemd[1]: Reached target Socket Units.
Jan 26 09:00:41 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 26 09:00:41 localhost systemd[1]: Starting Journal Service...
Jan 26 09:00:41 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 26 09:00:41 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 26 09:00:41 localhost systemd[1]: Starting Create System Users...
Jan 26 09:00:41 localhost systemd[1]: Starting Setup Virtual Console...
Jan 26 09:00:41 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 26 09:00:41 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 26 09:00:41 localhost systemd[1]: Finished Create System Users.
Jan 26 09:00:41 localhost systemd-journald[303]: Journal started
Jan 26 09:00:41 localhost systemd-journald[303]: Runtime Journal (/run/log/journal/0657a708098a4137a4d88ea25323424c) is 8.0M, max 153.6M, 145.6M free.
Jan 26 09:00:41 localhost systemd-sysusers[308]: Creating group 'users' with GID 100.
Jan 26 09:00:41 localhost systemd-sysusers[308]: Creating group 'dbus' with GID 81.
Jan 26 09:00:41 localhost systemd-sysusers[308]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 26 09:00:41 localhost systemd[1]: Started Journal Service.
Jan 26 09:00:41 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 26 09:00:41 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 26 09:00:41 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 26 09:00:41 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 26 09:00:41 localhost systemd[1]: Finished Setup Virtual Console.
Jan 26 09:00:41 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 26 09:00:41 localhost systemd[1]: Starting dracut cmdline hook...
Jan 26 09:00:41 localhost dracut-cmdline[324]: dracut-9 dracut-057-102.git20250818.el9
Jan 26 09:00:41 localhost dracut-cmdline[324]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 26 09:00:41 localhost systemd[1]: Finished dracut cmdline hook.
Jan 26 09:00:41 localhost systemd[1]: Starting dracut pre-udev hook...
Jan 26 09:00:41 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 26 09:00:41 localhost kernel: device-mapper: uevent: version 1.0.3
Jan 26 09:00:41 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 26 09:00:41 localhost kernel: RPC: Registered named UNIX socket transport module.
Jan 26 09:00:41 localhost kernel: RPC: Registered udp transport module.
Jan 26 09:00:41 localhost kernel: RPC: Registered tcp transport module.
Jan 26 09:00:41 localhost kernel: RPC: Registered tcp-with-tls transport module.
Jan 26 09:00:41 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 26 09:00:41 localhost rpc.statd[443]: Version 2.5.4 starting
Jan 26 09:00:41 localhost rpc.statd[443]: Initializing NSM state
Jan 26 09:00:41 localhost rpc.idmapd[448]: Setting log level to 0
Jan 26 09:00:41 localhost systemd[1]: Finished dracut pre-udev hook.
Jan 26 09:00:41 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 26 09:00:41 localhost systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Jan 26 09:00:41 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 26 09:00:41 localhost systemd[1]: Starting dracut pre-trigger hook...
Jan 26 09:00:41 localhost systemd[1]: Finished dracut pre-trigger hook.
Jan 26 09:00:41 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 26 09:00:41 localhost systemd[1]: Created slice Slice /system/modprobe.
Jan 26 09:00:41 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 26 09:00:41 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 26 09:00:41 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 26 09:00:41 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 26 09:00:41 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 26 09:00:41 localhost systemd[1]: Reached target Network.
Jan 26 09:00:41 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 26 09:00:41 localhost systemd[1]: Starting dracut initqueue hook...
Jan 26 09:00:42 localhost kernel: libata version 3.00 loaded.
Jan 26 09:00:42 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 26 09:00:42 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Jan 26 09:00:42 localhost kernel: scsi host0: ata_piix
Jan 26 09:00:42 localhost kernel: scsi host1: ata_piix
Jan 26 09:00:42 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 26 09:00:42 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 26 09:00:42 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 26 09:00:42 localhost kernel:  vda: vda1
Jan 26 09:00:42 localhost systemd[1]: Mounting Kernel Configuration File System...
Jan 26 09:00:42 localhost systemd[1]: Mounted Kernel Configuration File System.
Jan 26 09:00:42 localhost systemd[1]: Reached target System Initialization.
Jan 26 09:00:42 localhost systemd[1]: Reached target Basic System.
Jan 26 09:00:42 localhost kernel: ata1: found unknown device (class 0)
Jan 26 09:00:42 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 26 09:00:42 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 26 09:00:42 localhost systemd-udevd[464]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 09:00:42 localhost systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 26 09:00:42 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 26 09:00:42 localhost systemd[1]: Reached target Initrd Root Device.
Jan 26 09:00:42 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 26 09:00:42 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 26 09:00:42 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Jan 26 09:00:42 localhost systemd[1]: Finished dracut initqueue hook.
Jan 26 09:00:42 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Jan 26 09:00:42 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Jan 26 09:00:42 localhost systemd[1]: Reached target Remote File Systems.
Jan 26 09:00:42 localhost systemd[1]: Starting dracut pre-mount hook...
Jan 26 09:00:42 localhost systemd[1]: Finished dracut pre-mount hook.
Jan 26 09:00:42 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 26 09:00:42 localhost systemd-fsck[556]: /usr/sbin/fsck.xfs: XFS file system.
Jan 26 09:00:42 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 26 09:00:42 localhost systemd[1]: Mounting /sysroot...
Jan 26 09:00:42 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 26 09:00:42 localhost kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 26 09:00:42 localhost kernel: XFS (vda1): Ending clean mount
Jan 26 09:00:43 localhost systemd[1]: Mounted /sysroot.
Jan 26 09:00:43 localhost systemd[1]: Reached target Initrd Root File System.
Jan 26 09:00:43 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 26 09:00:43 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 26 09:00:43 localhost systemd[1]: Reached target Initrd File Systems.
Jan 26 09:00:43 localhost systemd[1]: Reached target Initrd Default Target.
Jan 26 09:00:43 localhost systemd[1]: Starting dracut mount hook...
Jan 26 09:00:43 localhost systemd[1]: Finished dracut mount hook.
Jan 26 09:00:43 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 26 09:00:43 localhost rpc.idmapd[448]: exiting on signal 15
Jan 26 09:00:43 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 26 09:00:43 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 26 09:00:43 localhost systemd[1]: Stopped target Network.
Jan 26 09:00:43 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 26 09:00:43 localhost systemd[1]: Stopped target Timer Units.
Jan 26 09:00:43 localhost systemd[1]: dbus.socket: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 26 09:00:43 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 26 09:00:43 localhost systemd[1]: Stopped target Initrd Default Target.
Jan 26 09:00:43 localhost systemd[1]: Stopped target Basic System.
Jan 26 09:00:43 localhost systemd[1]: Stopped target Initrd Root Device.
Jan 26 09:00:43 localhost systemd[1]: Stopped target Initrd /usr File System.
Jan 26 09:00:43 localhost systemd[1]: Stopped target Path Units.
Jan 26 09:00:43 localhost systemd[1]: Stopped target Remote File Systems.
Jan 26 09:00:43 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 26 09:00:43 localhost systemd[1]: Stopped target Slice Units.
Jan 26 09:00:43 localhost systemd[1]: Stopped target Socket Units.
Jan 26 09:00:43 localhost systemd[1]: Stopped target System Initialization.
Jan 26 09:00:43 localhost systemd[1]: Stopped target Local File Systems.
Jan 26 09:00:43 localhost systemd[1]: Stopped target Swaps.
Jan 26 09:00:43 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Stopped dracut mount hook.
Jan 26 09:00:43 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Stopped dracut pre-mount hook.
Jan 26 09:00:43 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Jan 26 09:00:43 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 26 09:00:43 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Stopped dracut initqueue hook.
Jan 26 09:00:43 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Stopped Apply Kernel Variables.
Jan 26 09:00:43 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Jan 26 09:00:43 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Stopped Coldplug All udev Devices.
Jan 26 09:00:43 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Stopped dracut pre-trigger hook.
Jan 26 09:00:43 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 26 09:00:43 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Stopped Setup Virtual Console.
Jan 26 09:00:43 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 26 09:00:43 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 26 09:00:43 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Closed udev Control Socket.
Jan 26 09:00:43 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Closed udev Kernel Socket.
Jan 26 09:00:43 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Stopped dracut pre-udev hook.
Jan 26 09:00:43 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Stopped dracut cmdline hook.
Jan 26 09:00:43 localhost systemd[1]: Starting Cleanup udev Database...
Jan 26 09:00:43 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 26 09:00:43 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Jan 26 09:00:43 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Stopped Create System Users.
Jan 26 09:00:43 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 26 09:00:43 localhost systemd[1]: Finished Cleanup udev Database.
Jan 26 09:00:43 localhost systemd[1]: Reached target Switch Root.
Jan 26 09:00:43 localhost systemd[1]: Starting Switch Root...
Jan 26 09:00:43 localhost systemd[1]: Switching root.
Jan 26 09:00:43 localhost systemd-journald[303]: Journal stopped
Jan 26 09:00:44 localhost systemd-journald[303]: Received SIGTERM from PID 1 (systemd).
Jan 26 09:00:44 localhost kernel: audit: type=1404 audit(1769418043.412:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 26 09:00:44 localhost kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 09:00:44 localhost kernel: SELinux:  policy capability open_perms=1
Jan 26 09:00:44 localhost kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 09:00:44 localhost kernel: SELinux:  policy capability always_check_network=0
Jan 26 09:00:44 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 09:00:44 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 09:00:44 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 09:00:44 localhost kernel: audit: type=1403 audit(1769418043.554:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 26 09:00:44 localhost systemd[1]: Successfully loaded SELinux policy in 145.312ms.
Jan 26 09:00:44 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 31.394ms.
Jan 26 09:00:44 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 26 09:00:44 localhost systemd[1]: Detected virtualization kvm.
Jan 26 09:00:44 localhost systemd[1]: Detected architecture x86-64.
Jan 26 09:00:44 localhost systemd-rc-local-generator[636]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:00:44 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 26 09:00:44 localhost systemd[1]: Stopped Switch Root.
Jan 26 09:00:44 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 26 09:00:44 localhost systemd[1]: Created slice Slice /system/getty.
Jan 26 09:00:44 localhost systemd[1]: Created slice Slice /system/serial-getty.
Jan 26 09:00:44 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Jan 26 09:00:44 localhost systemd[1]: Created slice User and Session Slice.
Jan 26 09:00:44 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 26 09:00:44 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Jan 26 09:00:44 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 26 09:00:44 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 26 09:00:44 localhost systemd[1]: Stopped target Switch Root.
Jan 26 09:00:44 localhost systemd[1]: Stopped target Initrd File Systems.
Jan 26 09:00:44 localhost systemd[1]: Stopped target Initrd Root File System.
Jan 26 09:00:44 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Jan 26 09:00:44 localhost systemd[1]: Reached target Path Units.
Jan 26 09:00:44 localhost systemd[1]: Reached target rpc_pipefs.target.
Jan 26 09:00:44 localhost systemd[1]: Reached target Slice Units.
Jan 26 09:00:44 localhost systemd[1]: Reached target Swaps.
Jan 26 09:00:44 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Jan 26 09:00:44 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Jan 26 09:00:44 localhost systemd[1]: Reached target RPC Port Mapper.
Jan 26 09:00:44 localhost systemd[1]: Listening on Process Core Dump Socket.
Jan 26 09:00:44 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Jan 26 09:00:44 localhost systemd[1]: Listening on udev Control Socket.
Jan 26 09:00:44 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 26 09:00:44 localhost systemd[1]: Mounting Huge Pages File System...
Jan 26 09:00:44 localhost systemd[1]: Mounting POSIX Message Queue File System...
Jan 26 09:00:44 localhost systemd[1]: Mounting Kernel Debug File System...
Jan 26 09:00:44 localhost systemd[1]: Mounting Kernel Trace File System...
Jan 26 09:00:44 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 26 09:00:44 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 26 09:00:44 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 26 09:00:44 localhost systemd[1]: Starting Load Kernel Module drm...
Jan 26 09:00:44 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Jan 26 09:00:44 localhost systemd[1]: Starting Load Kernel Module fuse...
Jan 26 09:00:44 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 26 09:00:44 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 26 09:00:44 localhost systemd[1]: Stopped File System Check on Root Device.
Jan 26 09:00:44 localhost systemd[1]: Stopped Journal Service.
Jan 26 09:00:44 localhost systemd[1]: Starting Journal Service...
Jan 26 09:00:44 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 26 09:00:44 localhost systemd[1]: Starting Generate network units from Kernel command line...
Jan 26 09:00:44 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 26 09:00:44 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Jan 26 09:00:44 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 26 09:00:44 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 26 09:00:44 localhost kernel: fuse: init (API version 7.37)
Jan 26 09:00:44 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 26 09:00:44 localhost systemd-journald[677]: Journal started
Jan 26 09:00:44 localhost systemd-journald[677]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 26 09:00:43 localhost systemd[1]: Queued start job for default target Multi-User System.
Jan 26 09:00:43 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 26 09:00:44 localhost systemd[1]: Started Journal Service.
Jan 26 09:00:44 localhost systemd[1]: Mounted Huge Pages File System.
Jan 26 09:00:44 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 26 09:00:44 localhost systemd[1]: Mounted POSIX Message Queue File System.
Jan 26 09:00:44 localhost systemd[1]: Mounted Kernel Debug File System.
Jan 26 09:00:44 localhost systemd[1]: Mounted Kernel Trace File System.
Jan 26 09:00:44 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 26 09:00:44 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 26 09:00:44 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 26 09:00:44 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 26 09:00:44 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 26 09:00:44 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 26 09:00:44 localhost systemd[1]: Finished Load Kernel Module fuse.
Jan 26 09:00:44 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 26 09:00:44 localhost systemd[1]: Finished Generate network units from Kernel command line.
Jan 26 09:00:44 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 26 09:00:44 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 26 09:00:44 localhost systemd[1]: Mounting FUSE Control File System...
Jan 26 09:00:44 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 26 09:00:44 localhost systemd[1]: Starting Rebuild Hardware Database...
Jan 26 09:00:44 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 26 09:00:44 localhost kernel: ACPI: bus type drm_connector registered
Jan 26 09:00:44 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 26 09:00:44 localhost systemd[1]: Starting Load/Save OS Random Seed...
Jan 26 09:00:44 localhost systemd-journald[677]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 26 09:00:44 localhost systemd-journald[677]: Received client request to flush runtime journal.
Jan 26 09:00:44 localhost systemd[1]: Starting Create System Users...
Jan 26 09:00:44 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 26 09:00:44 localhost systemd[1]: Finished Load Kernel Module drm.
Jan 26 09:00:44 localhost systemd[1]: Mounted FUSE Control File System.
Jan 26 09:00:44 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 26 09:00:44 localhost systemd[1]: Finished Load/Save OS Random Seed.
Jan 26 09:00:44 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 26 09:00:44 localhost systemd[1]: Finished Create System Users.
Jan 26 09:00:44 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 26 09:00:44 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 26 09:00:44 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 26 09:00:44 localhost systemd[1]: Reached target Preparation for Local File Systems.
Jan 26 09:00:44 localhost systemd[1]: Reached target Local File Systems.
Jan 26 09:00:44 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 26 09:00:44 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 26 09:00:44 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 26 09:00:44 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 26 09:00:44 localhost systemd[1]: Starting Automatic Boot Loader Update...
Jan 26 09:00:44 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 26 09:00:44 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 26 09:00:44 localhost bootctl[694]: Couldn't find EFI system partition, skipping.
Jan 26 09:00:44 localhost systemd[1]: Finished Automatic Boot Loader Update.
Jan 26 09:00:44 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 26 09:00:44 localhost systemd[1]: Starting Security Auditing Service...
Jan 26 09:00:44 localhost systemd[1]: Starting RPC Bind...
Jan 26 09:00:44 localhost systemd[1]: Starting Rebuild Journal Catalog...
Jan 26 09:00:44 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 26 09:00:44 localhost auditd[700]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 26 09:00:44 localhost auditd[700]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 26 09:00:44 localhost systemd[1]: Started RPC Bind.
Jan 26 09:00:44 localhost systemd[1]: Finished Rebuild Journal Catalog.
Jan 26 09:00:44 localhost augenrules[705]: /sbin/augenrules: No change
Jan 26 09:00:44 localhost augenrules[720]: No rules
Jan 26 09:00:44 localhost augenrules[720]: enabled 1
Jan 26 09:00:44 localhost augenrules[720]: failure 1
Jan 26 09:00:44 localhost augenrules[720]: pid 700
Jan 26 09:00:44 localhost augenrules[720]: rate_limit 0
Jan 26 09:00:44 localhost augenrules[720]: backlog_limit 8192
Jan 26 09:00:44 localhost augenrules[720]: lost 0
Jan 26 09:00:44 localhost augenrules[720]: backlog 3
Jan 26 09:00:44 localhost augenrules[720]: backlog_wait_time 60000
Jan 26 09:00:44 localhost augenrules[720]: backlog_wait_time_actual 0
Jan 26 09:00:44 localhost augenrules[720]: enabled 1
Jan 26 09:00:44 localhost augenrules[720]: failure 1
Jan 26 09:00:44 localhost augenrules[720]: pid 700
Jan 26 09:00:44 localhost augenrules[720]: rate_limit 0
Jan 26 09:00:44 localhost augenrules[720]: backlog_limit 8192
Jan 26 09:00:44 localhost augenrules[720]: lost 0
Jan 26 09:00:44 localhost augenrules[720]: backlog 0
Jan 26 09:00:44 localhost augenrules[720]: backlog_wait_time 60000
Jan 26 09:00:44 localhost augenrules[720]: backlog_wait_time_actual 0
Jan 26 09:00:44 localhost augenrules[720]: enabled 1
Jan 26 09:00:44 localhost augenrules[720]: failure 1
Jan 26 09:00:44 localhost augenrules[720]: pid 700
Jan 26 09:00:44 localhost augenrules[720]: rate_limit 0
Jan 26 09:00:44 localhost augenrules[720]: backlog_limit 8192
Jan 26 09:00:44 localhost augenrules[720]: lost 0
Jan 26 09:00:44 localhost augenrules[720]: backlog 3
Jan 26 09:00:44 localhost augenrules[720]: backlog_wait_time 60000
Jan 26 09:00:44 localhost augenrules[720]: backlog_wait_time_actual 0
Jan 26 09:00:44 localhost systemd[1]: Started Security Auditing Service.
Jan 26 09:00:44 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 26 09:00:44 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 26 09:00:44 localhost systemd[1]: Finished Rebuild Hardware Database.
Jan 26 09:00:44 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 26 09:00:44 localhost systemd[1]: Starting Update is Completed...
Jan 26 09:00:44 localhost systemd[1]: Finished Update is Completed.
Jan 26 09:00:45 localhost systemd-udevd[728]: Using default interface naming scheme 'rhel-9.0'.
Jan 26 09:00:45 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 26 09:00:45 localhost systemd[1]: Reached target System Initialization.
Jan 26 09:00:45 localhost systemd[1]: Started dnf makecache --timer.
Jan 26 09:00:45 localhost systemd[1]: Started Daily rotation of log files.
Jan 26 09:00:45 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 26 09:00:45 localhost systemd[1]: Reached target Timer Units.
Jan 26 09:00:45 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 26 09:00:45 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 26 09:00:45 localhost systemd[1]: Reached target Socket Units.
Jan 26 09:00:45 localhost systemd-udevd[737]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 09:00:45 localhost systemd[1]: Starting D-Bus System Message Bus...
Jan 26 09:00:45 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 26 09:00:45 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 26 09:00:45 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 26 09:00:45 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 26 09:00:45 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 26 09:00:45 localhost systemd[1]: Started D-Bus System Message Bus.
Jan 26 09:00:45 localhost systemd[1]: Reached target Basic System.
Jan 26 09:00:45 localhost dbus-broker-lau[764]: Ready
Jan 26 09:00:45 localhost systemd[1]: Starting NTP client/server...
Jan 26 09:00:45 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 26 09:00:45 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 26 09:00:45 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 26 09:00:45 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 26 09:00:45 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 26 09:00:45 localhost systemd[1]: Starting IPv4 firewall with iptables...
Jan 26 09:00:45 localhost systemd[1]: Started irqbalance daemon.
Jan 26 09:00:45 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 26 09:00:45 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 09:00:45 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 09:00:45 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 09:00:45 localhost systemd[1]: Reached target sshd-keygen.target.
Jan 26 09:00:45 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 26 09:00:45 localhost systemd[1]: Reached target User and Group Name Lookups.
Jan 26 09:00:45 localhost systemd[1]: Starting User Login Management...
Jan 26 09:00:45 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 26 09:00:45 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 26 09:00:45 localhost chronyd[791]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 26 09:00:45 localhost chronyd[791]: Loaded 0 symmetric keys
Jan 26 09:00:45 localhost chronyd[791]: Using right/UTC timezone to obtain leap second data
Jan 26 09:00:45 localhost chronyd[791]: Loaded seccomp filter (level 2)
Jan 26 09:00:45 localhost systemd[1]: Started NTP client/server.
Jan 26 09:00:45 localhost systemd-logind[783]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 26 09:00:45 localhost systemd-logind[783]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 26 09:00:45 localhost systemd-logind[783]: New seat seat0.
Jan 26 09:00:45 localhost systemd[1]: Started User Login Management.
Jan 26 09:00:45 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 26 09:00:45 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 26 09:00:45 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 26 09:00:45 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 26 09:00:45 localhost kernel: Console: switching to colour dummy device 80x25
Jan 26 09:00:45 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 26 09:00:45 localhost kernel: [drm] features: -context_init
Jan 26 09:00:45 localhost kernel: [drm] number of scanouts: 1
Jan 26 09:00:45 localhost kernel: [drm] number of cap sets: 0
Jan 26 09:00:45 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 26 09:00:45 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 26 09:00:45 localhost kernel: Console: switching to colour frame buffer device 128x48
Jan 26 09:00:45 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 26 09:00:45 localhost kernel: kvm_amd: TSC scaling supported
Jan 26 09:00:45 localhost kernel: kvm_amd: Nested Virtualization enabled
Jan 26 09:00:45 localhost kernel: kvm_amd: Nested Paging enabled
Jan 26 09:00:45 localhost kernel: kvm_amd: LBR virtualization supported
Jan 26 09:00:45 localhost iptables.init[776]: iptables: Applying firewall rules: [  OK  ]
Jan 26 09:00:45 localhost systemd[1]: Finished IPv4 firewall with iptables.
Jan 26 09:00:45 localhost cloud-init[838]: Cloud-init v. 24.4-8.el9 running 'init-local' at Mon, 26 Jan 2026 09:00:45 +0000. Up 6.25 seconds.
Jan 26 09:00:45 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Jan 26 09:00:45 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Jan 26 09:00:45 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp75cc9w6z.mount: Deactivated successfully.
Jan 26 09:00:45 localhost systemd[1]: Starting Hostname Service...
Jan 26 09:00:45 localhost systemd[1]: Started Hostname Service.
Jan 26 09:00:45 np0005595445.novalocal systemd-hostnamed[852]: Hostname set to <np0005595445.novalocal> (static)
Jan 26 09:00:46 np0005595445.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 26 09:00:46 np0005595445.novalocal systemd[1]: Reached target Preparation for Network.
Jan 26 09:00:46 np0005595445.novalocal systemd[1]: Starting Network Manager...
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1345] NetworkManager (version 1.54.3-2.el9) is starting... (boot:ddaf00d4-1dc5-4d7f-b5ab-626f2ab79e8a)
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1350] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1413] manager[0x5638e96c4000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1450] hostname: hostname: using hostnamed
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1451] hostname: static hostname changed from (none) to "np0005595445.novalocal"
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1456] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1564] manager[0x5638e96c4000]: rfkill: Wi-Fi hardware radio set enabled
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1565] manager[0x5638e96c4000]: rfkill: WWAN hardware radio set enabled
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1602] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1602] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1602] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1603] manager: Networking is enabled by state file
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1605] settings: Loaded settings plugin: keyfile (internal)
Jan 26 09:00:46 np0005595445.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1613] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1630] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1639] dhcp: init: Using DHCP client 'internal'
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1641] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1652] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1658] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1664] device (lo): Activation: starting connection 'lo' (02fbee44-d9d6-4838-9f7e-bcbf35b7d384)
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1672] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1673] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1694] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1698] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1699] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1700] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1702] device (eth0): carrier: link connected
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1704] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1708] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1712] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1716] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1716] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1718] manager: NetworkManager state is now CONNECTING
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1719] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1723] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1726] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 09:00:46 np0005595445.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 09:00:46 np0005595445.novalocal systemd[1]: Started Network Manager.
Jan 26 09:00:46 np0005595445.novalocal systemd[1]: Reached target Network.
Jan 26 09:00:46 np0005595445.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 26 09:00:46 np0005595445.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 26 09:00:46 np0005595445.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1943] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1946] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 26 09:00:46 np0005595445.novalocal NetworkManager[856]: <info>  [1769418046.1951] device (lo): Activation: successful, device activated.
Jan 26 09:00:46 np0005595445.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Jan 26 09:00:46 np0005595445.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 26 09:00:46 np0005595445.novalocal systemd[1]: Reached target NFS client services.
Jan 26 09:00:46 np0005595445.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Jan 26 09:00:46 np0005595445.novalocal systemd[1]: Reached target Remote File Systems.
Jan 26 09:00:46 np0005595445.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 26 09:00:49 np0005595445.novalocal NetworkManager[856]: <info>  [1769418049.7788] dhcp4 (eth0): state changed new lease, address=38.102.83.217
Jan 26 09:00:49 np0005595445.novalocal NetworkManager[856]: <info>  [1769418049.7799] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 26 09:00:49 np0005595445.novalocal NetworkManager[856]: <info>  [1769418049.7824] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 09:00:49 np0005595445.novalocal NetworkManager[856]: <info>  [1769418049.7860] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 09:00:49 np0005595445.novalocal NetworkManager[856]: <info>  [1769418049.7861] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 09:00:49 np0005595445.novalocal NetworkManager[856]: <info>  [1769418049.7864] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 09:00:49 np0005595445.novalocal NetworkManager[856]: <info>  [1769418049.7866] device (eth0): Activation: successful, device activated.
Jan 26 09:00:49 np0005595445.novalocal NetworkManager[856]: <info>  [1769418049.7871] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 26 09:00:49 np0005595445.novalocal NetworkManager[856]: <info>  [1769418049.7872] manager: startup complete
Jan 26 09:00:49 np0005595445.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 26 09:00:49 np0005595445.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: Cloud-init v. 24.4-8.el9 running 'init' at Mon, 26 Jan 2026 09:00:50 +0000. Up 10.70 seconds.
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: ++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: | Device |  Up  |           Address           |      Mask     | Scope  |     Hw-Address    |
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: |  eth0  | True |        38.102.83.217        | 255.255.255.0 | global | fa:16:3e:37:06:29 |
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: |  eth0  | True | fe80::f816:3eff:fe37:629/64 |       .       |  link  | fa:16:3e:37:06:29 |
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: |   lo   | True |          127.0.0.1          |   255.0.0.0   |  host  |         .         |
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: |   lo   | True |           ::1/128           |       .       |  host  |         .         |
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: |   3   |    local    |    ::   |    eth0   |   U   |
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: |   4   |  multicast  |    ::   |    eth0   |   U   |
Jan 26 09:00:50 np0005595445.novalocal cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 26 09:00:50 np0005595445.novalocal useradd[988]: new group: name=cloud-user, GID=1001
Jan 26 09:00:50 np0005595445.novalocal useradd[988]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Jan 26 09:00:51 np0005595445.novalocal useradd[988]: add 'cloud-user' to group 'adm'
Jan 26 09:00:51 np0005595445.novalocal useradd[988]: add 'cloud-user' to group 'systemd-journal'
Jan 26 09:00:51 np0005595445.novalocal useradd[988]: add 'cloud-user' to shadow group 'adm'
Jan 26 09:00:51 np0005595445.novalocal useradd[988]: add 'cloud-user' to shadow group 'systemd-journal'
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: Generating public/private rsa key pair.
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: The key fingerprint is:
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: SHA256:91aVOgqkVUYNufwoffI9fg3P2WuF7eXKoMDF99BQgsU root@np0005595445.novalocal
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: The key's randomart image is:
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: +---[RSA 3072]----+
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |          .O* .  |
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |          +.E+  .|
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |         o. o  ..|
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |        +. o o.. |
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |       .S.= =ooo |
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |       . +.*.*+ +|
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |        o ..B o**|
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |         . o + =O|
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |          .   =++|
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: +----[SHA256]-----+
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: Generating public/private ecdsa key pair.
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: The key fingerprint is:
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: SHA256:nJVILKfidZN5/mS6tozdBjlcS6E3CzkyDWlAcgjZf+8 root@np0005595445.novalocal
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: The key's randomart image is:
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: +---[ECDSA 256]---+
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |  .+.o+o..       |
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |  . oo..*. ..    |
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |     . =.=oo .   |
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |    . +.OoB =    |
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |   . o oSO B +   |
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |    .     B =    |
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |         . B     |
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |         +E.o    |
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |        ..=+.    |
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: +----[SHA256]-----+
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: Generating public/private ed25519 key pair.
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: The key fingerprint is:
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: SHA256:dQAzxXqKTk7xH9NGYVPEVgUNRMDrhIkCFBUqOSwhkz4 root@np0005595445.novalocal
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: The key's randomart image is:
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: +--[ED25519 256]--+
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |+. .ooo.++o..BB=+|
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |o+ ...   o..= o .|
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |o = ..   o.+.=   |
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: | E o  o o.+.+    |
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |  .    =So =     |
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |      + o o +    |
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |     =   . +     |
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |      o   .      |
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: |                 |
Jan 26 09:00:51 np0005595445.novalocal cloud-init[920]: +----[SHA256]-----+
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Reached target Cloud-config availability.
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Reached target Network is Online.
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Starting Crash recovery kernel arming...
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Starting System Logging Service...
Jan 26 09:00:51 np0005595445.novalocal sm-notify[1004]: Version 2.5.4 starting
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Starting OpenSSH server daemon...
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Starting Permit User Sessions...
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Started Notify NFS peers of a restart.
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Finished Permit User Sessions.
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Started Command Scheduler.
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Started Getty on tty1.
Jan 26 09:00:51 np0005595445.novalocal crond[1009]: (CRON) STARTUP (1.5.7)
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Started Serial Getty on ttyS0.
Jan 26 09:00:51 np0005595445.novalocal crond[1009]: (CRON) INFO (Syslog will be used instead of sendmail.)
Jan 26 09:00:51 np0005595445.novalocal crond[1009]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 36% if used.)
Jan 26 09:00:51 np0005595445.novalocal crond[1009]: (CRON) INFO (running with inotify support)
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Reached target Login Prompts.
Jan 26 09:00:51 np0005595445.novalocal sshd[1006]: Server listening on 0.0.0.0 port 22.
Jan 26 09:00:51 np0005595445.novalocal sshd[1006]: Server listening on :: port 22.
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Started OpenSSH server daemon.
Jan 26 09:00:51 np0005595445.novalocal rsyslogd[1005]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1005" x-info="https://www.rsyslog.com"] start
Jan 26 09:00:51 np0005595445.novalocal rsyslogd[1005]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Started System Logging Service.
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Reached target Multi-User System.
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 26 09:00:51 np0005595445.novalocal rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 09:00:51 np0005595445.novalocal kdumpctl[1014]: kdump: No kdump initial ramdisk found.
Jan 26 09:00:51 np0005595445.novalocal kdumpctl[1014]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 26 09:00:51 np0005595445.novalocal cloud-init[1132]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Mon, 26 Jan 2026 09:00:51 +0000. Up 12.45 seconds.
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Jan 26 09:00:51 np0005595445.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Jan 26 09:00:52 np0005595445.novalocal dracut[1265]: dracut-057-102.git20250818.el9
Jan 26 09:00:52 np0005595445.novalocal cloud-init[1283]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Mon, 26 Jan 2026 09:00:52 +0000. Up 12.84 seconds.
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 26 09:00:52 np0005595445.novalocal cloud-init[1298]: #############################################################
Jan 26 09:00:52 np0005595445.novalocal cloud-init[1302]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 26 09:00:52 np0005595445.novalocal cloud-init[1307]: 256 SHA256:nJVILKfidZN5/mS6tozdBjlcS6E3CzkyDWlAcgjZf+8 root@np0005595445.novalocal (ECDSA)
Jan 26 09:00:52 np0005595445.novalocal cloud-init[1312]: 256 SHA256:dQAzxXqKTk7xH9NGYVPEVgUNRMDrhIkCFBUqOSwhkz4 root@np0005595445.novalocal (ED25519)
Jan 26 09:00:52 np0005595445.novalocal cloud-init[1319]: 3072 SHA256:91aVOgqkVUYNufwoffI9fg3P2WuF7eXKoMDF99BQgsU root@np0005595445.novalocal (RSA)
Jan 26 09:00:52 np0005595445.novalocal cloud-init[1320]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 26 09:00:52 np0005595445.novalocal cloud-init[1321]: #############################################################
Jan 26 09:00:52 np0005595445.novalocal cloud-init[1283]: Cloud-init v. 24.4-8.el9 finished at Mon, 26 Jan 2026 09:00:52 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 13.02 seconds
Jan 26 09:00:52 np0005595445.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Jan 26 09:00:52 np0005595445.novalocal systemd[1]: Reached target Cloud-init target.
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 26 09:00:52 np0005595445.novalocal dracut[1267]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: Module 'resume' will not be installed, because it's in the list to be omitted!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: memstrack is not available
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: memstrack is not available
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: *** Including module: systemd ***
Jan 26 09:00:53 np0005595445.novalocal dracut[1267]: *** Including module: fips ***
Jan 26 09:00:54 np0005595445.novalocal dracut[1267]: *** Including module: systemd-initrd ***
Jan 26 09:00:54 np0005595445.novalocal dracut[1267]: *** Including module: i18n ***
Jan 26 09:00:54 np0005595445.novalocal dracut[1267]: *** Including module: drm ***
Jan 26 09:00:54 np0005595445.novalocal sshd-session[2174]: Connection reset by 38.102.83.114 port 40006 [preauth]
Jan 26 09:00:54 np0005595445.novalocal sshd-session[2182]: Unable to negotiate with 38.102.83.114 port 40012: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Jan 26 09:00:54 np0005595445.novalocal sshd-session[2190]: Connection reset by 38.102.83.114 port 40026 [preauth]
Jan 26 09:00:54 np0005595445.novalocal sshd-session[2198]: Unable to negotiate with 38.102.83.114 port 40032: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Jan 26 09:00:54 np0005595445.novalocal dracut[1267]: *** Including module: prefixdevname ***
Jan 26 09:00:54 np0005595445.novalocal dracut[1267]: *** Including module: kernel-modules ***
Jan 26 09:00:54 np0005595445.novalocal sshd-session[2206]: Unable to negotiate with 38.102.83.114 port 40046: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Jan 26 09:00:54 np0005595445.novalocal sshd-session[2217]: Connection closed by 38.102.83.114 port 35668 [preauth]
Jan 26 09:00:54 np0005595445.novalocal sshd-session[2227]: Unable to negotiate with 38.102.83.114 port 35690: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Jan 26 09:00:54 np0005595445.novalocal sshd-session[2229]: Unable to negotiate with 38.102.83.114 port 35700: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Jan 26 09:00:54 np0005595445.novalocal sshd-session[2223]: Connection closed by 38.102.83.114 port 35676 [preauth]
Jan 26 09:00:54 np0005595445.novalocal kernel: block vda: the capability attribute has been deprecated.
Jan 26 09:00:54 np0005595445.novalocal chronyd[791]: Selected source 149.56.19.163 (2.centos.pool.ntp.org)
Jan 26 09:00:54 np0005595445.novalocal chronyd[791]: System clock TAI offset set to 37 seconds
Jan 26 09:00:55 np0005595445.novalocal dracut[1267]: *** Including module: kernel-modules-extra ***
Jan 26 09:00:55 np0005595445.novalocal dracut[1267]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Jan 26 09:00:55 np0005595445.novalocal dracut[1267]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Jan 26 09:00:55 np0005595445.novalocal dracut[1267]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Jan 26 09:00:55 np0005595445.novalocal dracut[1267]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Jan 26 09:00:55 np0005595445.novalocal dracut[1267]: *** Including module: qemu ***
Jan 26 09:00:55 np0005595445.novalocal dracut[1267]: *** Including module: fstab-sys ***
Jan 26 09:00:55 np0005595445.novalocal dracut[1267]: *** Including module: rootfs-block ***
Jan 26 09:00:55 np0005595445.novalocal dracut[1267]: *** Including module: terminfo ***
Jan 26 09:00:55 np0005595445.novalocal dracut[1267]: *** Including module: udev-rules ***
Jan 26 09:00:55 np0005595445.novalocal irqbalance[778]: Cannot change IRQ 35 affinity: Operation not permitted
Jan 26 09:00:55 np0005595445.novalocal irqbalance[778]: IRQ 35 affinity is now unmanaged
Jan 26 09:00:55 np0005595445.novalocal irqbalance[778]: Cannot change IRQ 33 affinity: Operation not permitted
Jan 26 09:00:55 np0005595445.novalocal irqbalance[778]: IRQ 33 affinity is now unmanaged
Jan 26 09:00:55 np0005595445.novalocal irqbalance[778]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 26 09:00:55 np0005595445.novalocal irqbalance[778]: IRQ 31 affinity is now unmanaged
Jan 26 09:00:55 np0005595445.novalocal irqbalance[778]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 26 09:00:55 np0005595445.novalocal irqbalance[778]: IRQ 28 affinity is now unmanaged
Jan 26 09:00:55 np0005595445.novalocal irqbalance[778]: Cannot change IRQ 34 affinity: Operation not permitted
Jan 26 09:00:55 np0005595445.novalocal irqbalance[778]: IRQ 34 affinity is now unmanaged
Jan 26 09:00:55 np0005595445.novalocal irqbalance[778]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 26 09:00:55 np0005595445.novalocal irqbalance[778]: IRQ 32 affinity is now unmanaged
Jan 26 09:00:55 np0005595445.novalocal irqbalance[778]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 26 09:00:55 np0005595445.novalocal irqbalance[778]: IRQ 30 affinity is now unmanaged
Jan 26 09:00:55 np0005595445.novalocal irqbalance[778]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 26 09:00:55 np0005595445.novalocal irqbalance[778]: IRQ 29 affinity is now unmanaged
Jan 26 09:00:55 np0005595445.novalocal dracut[1267]: Skipping udev rule: 91-permissions.rules
Jan 26 09:00:55 np0005595445.novalocal dracut[1267]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 26 09:00:55 np0005595445.novalocal dracut[1267]: *** Including module: virtiofs ***
Jan 26 09:00:55 np0005595445.novalocal dracut[1267]: *** Including module: dracut-systemd ***
Jan 26 09:00:55 np0005595445.novalocal dracut[1267]: *** Including module: usrmount ***
Jan 26 09:00:55 np0005595445.novalocal dracut[1267]: *** Including module: base ***
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]: *** Including module: fs-lib ***
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]: *** Including module: kdumpbase ***
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:   microcode_ctl module: mangling fw_dir
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: configuration "intel" is ignored
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]: *** Including module: openssl ***
Jan 26 09:00:56 np0005595445.novalocal dracut[1267]: *** Including module: shutdown ***
Jan 26 09:00:57 np0005595445.novalocal dracut[1267]: *** Including module: squash ***
Jan 26 09:00:57 np0005595445.novalocal dracut[1267]: *** Including modules done ***
Jan 26 09:00:57 np0005595445.novalocal dracut[1267]: *** Installing kernel module dependencies ***
Jan 26 09:00:57 np0005595445.novalocal dracut[1267]: *** Installing kernel module dependencies done ***
Jan 26 09:00:57 np0005595445.novalocal dracut[1267]: *** Resolving executable dependencies ***
Jan 26 09:00:59 np0005595445.novalocal dracut[1267]: *** Resolving executable dependencies done ***
Jan 26 09:00:59 np0005595445.novalocal dracut[1267]: *** Generating early-microcode cpio image ***
Jan 26 09:00:59 np0005595445.novalocal dracut[1267]: *** Store current command line parameters ***
Jan 26 09:00:59 np0005595445.novalocal dracut[1267]: Stored kernel commandline:
Jan 26 09:00:59 np0005595445.novalocal dracut[1267]: No dracut internal kernel commandline stored in the initramfs
Jan 26 09:00:59 np0005595445.novalocal dracut[1267]: *** Install squash loader ***
Jan 26 09:00:59 np0005595445.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 09:01:00 np0005595445.novalocal dracut[1267]: *** Squashing the files inside the initramfs ***
Jan 26 09:01:01 np0005595445.novalocal dracut[1267]: *** Squashing the files inside the initramfs done ***
Jan 26 09:01:01 np0005595445.novalocal dracut[1267]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 26 09:01:01 np0005595445.novalocal dracut[1267]: *** Hardlinking files ***
Jan 26 09:01:01 np0005595445.novalocal dracut[1267]: Mode:           real
Jan 26 09:01:01 np0005595445.novalocal dracut[1267]: Files:          50
Jan 26 09:01:01 np0005595445.novalocal dracut[1267]: Linked:         0 files
Jan 26 09:01:01 np0005595445.novalocal dracut[1267]: Compared:       0 xattrs
Jan 26 09:01:01 np0005595445.novalocal dracut[1267]: Compared:       0 files
Jan 26 09:01:01 np0005595445.novalocal dracut[1267]: Saved:          0 B
Jan 26 09:01:01 np0005595445.novalocal dracut[1267]: Duration:       0.000883 seconds
Jan 26 09:01:01 np0005595445.novalocal dracut[1267]: *** Hardlinking files done ***
Jan 26 09:01:01 np0005595445.novalocal CROND[4187]: (root) CMD (run-parts /etc/cron.hourly)
Jan 26 09:01:01 np0005595445.novalocal run-parts[4191]: (/etc/cron.hourly) starting 0anacron
Jan 26 09:01:01 np0005595445.novalocal anacron[4199]: Anacron started on 2026-01-26
Jan 26 09:01:01 np0005595445.novalocal anacron[4199]: Will run job `cron.daily' in 18 min.
Jan 26 09:01:01 np0005595445.novalocal anacron[4199]: Will run job `cron.weekly' in 38 min.
Jan 26 09:01:01 np0005595445.novalocal anacron[4199]: Will run job `cron.monthly' in 58 min.
Jan 26 09:01:01 np0005595445.novalocal anacron[4199]: Jobs will be executed sequentially
Jan 26 09:01:01 np0005595445.novalocal run-parts[4201]: (/etc/cron.hourly) finished 0anacron
Jan 26 09:01:01 np0005595445.novalocal CROND[4186]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 26 09:01:01 np0005595445.novalocal dracut[1267]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 26 09:01:02 np0005595445.novalocal kdumpctl[1014]: kdump: kexec: loaded kdump kernel
Jan 26 09:01:02 np0005595445.novalocal kdumpctl[1014]: kdump: Starting kdump: [OK]
Jan 26 09:01:02 np0005595445.novalocal systemd[1]: Finished Crash recovery kernel arming.
Jan 26 09:01:02 np0005595445.novalocal systemd[1]: Startup finished in 1.551s (kernel) + 2.488s (initrd) + 18.743s (userspace) = 22.783s.
Jan 26 09:01:12 np0005595445.novalocal sshd-session[4316]: Accepted publickey for zuul from 38.102.83.114 port 59554 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Jan 26 09:01:12 np0005595445.novalocal systemd[1]: Created slice User Slice of UID 1000.
Jan 26 09:01:12 np0005595445.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 26 09:01:12 np0005595445.novalocal systemd-logind[783]: New session 1 of user zuul.
Jan 26 09:01:12 np0005595445.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 26 09:01:12 np0005595445.novalocal systemd[1]: Starting User Manager for UID 1000...
Jan 26 09:01:12 np0005595445.novalocal systemd[4320]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:01:12 np0005595445.novalocal systemd[4320]: Queued start job for default target Main User Target.
Jan 26 09:01:12 np0005595445.novalocal systemd[4320]: Created slice User Application Slice.
Jan 26 09:01:12 np0005595445.novalocal systemd[4320]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 26 09:01:12 np0005595445.novalocal systemd[4320]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 09:01:12 np0005595445.novalocal systemd[4320]: Reached target Paths.
Jan 26 09:01:12 np0005595445.novalocal systemd[4320]: Reached target Timers.
Jan 26 09:01:12 np0005595445.novalocal systemd[4320]: Starting D-Bus User Message Bus Socket...
Jan 26 09:01:12 np0005595445.novalocal systemd[4320]: Starting Create User's Volatile Files and Directories...
Jan 26 09:01:12 np0005595445.novalocal systemd[4320]: Finished Create User's Volatile Files and Directories.
Jan 26 09:01:12 np0005595445.novalocal systemd[4320]: Listening on D-Bus User Message Bus Socket.
Jan 26 09:01:12 np0005595445.novalocal systemd[4320]: Reached target Sockets.
Jan 26 09:01:12 np0005595445.novalocal systemd[4320]: Reached target Basic System.
Jan 26 09:01:12 np0005595445.novalocal systemd[4320]: Reached target Main User Target.
Jan 26 09:01:12 np0005595445.novalocal systemd[4320]: Startup finished in 97ms.
Jan 26 09:01:12 np0005595445.novalocal systemd[1]: Started User Manager for UID 1000.
Jan 26 09:01:12 np0005595445.novalocal systemd[1]: Started Session 1 of User zuul.
Jan 26 09:01:12 np0005595445.novalocal sshd-session[4316]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:01:13 np0005595445.novalocal python3[4402]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:01:16 np0005595445.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 09:01:16 np0005595445.novalocal python3[4432]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:01:23 np0005595445.novalocal python3[4490]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:01:24 np0005595445.novalocal python3[4530]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 26 09:01:26 np0005595445.novalocal python3[4556]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDcdeFK+2Uzt/crRIxrw7Ii0Wo86Wha7SQ4BMdscA2exHPGkBWYBIRcQLWh4xXNIJqC/AbzacMgTYrRAOvShuuLUpKyUrfzg1ixfRmJf9fdw2BnSl3RjaKwYMifr2EHSvqhf5bD53uBkC+IdHfTnkuZk6EY16XIhr9eCxuKNHAwKJpnEOyw1gCntHfxFz0wBfy4kv0fT3TjsjCqzDNTzpWx8b5EO9vxnMmoYiZfDcbf2IFeK5LN6O1oAinJsvJV4PpR7ajuvFx5ScMj/FmW42D4VqeCnnHNS5dWt8JHxwY3glRh2xbY1AFfOTDQ7mJSgDV1rY+vTDOxZH3NcovSw7e0hh1Qt3oRYf47AAcmQdH72ljw6N0w34lxQMgBXA4gr6gzREYttTLX3EzRinYa6SypE2Grj5mT9zmv/OvQcULUWVTP443n0NBQIl+NzQqTOwT0s5E1arsVCcgSGTH/tsVlIFM7jffJDMuZorpoMWq/apou6G84JCh7dpggM1MDTCM= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:26 np0005595445.novalocal python3[4580]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:01:27 np0005595445.novalocal python3[4679]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 09:01:27 np0005595445.novalocal python3[4750]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769418087.1324046-252-45805790609404/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=ce1a2ab08f8a45f8bb0154795a55a641_id_rsa follow=False checksum=50f004a61600a842dd3e22b3105eef0d4eef20ff backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:01:28 np0005595445.novalocal python3[4873]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 09:01:28 np0005595445.novalocal python3[4944]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769418088.1256952-307-195949309720022/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=ce1a2ab08f8a45f8bb0154795a55a641_id_rsa.pub follow=False checksum=c08207c95d113b3d2dc53dab777685d45917b3fb backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:01:30 np0005595445.novalocal python3[4992]: ansible-ping Invoked with data=pong
Jan 26 09:01:31 np0005595445.novalocal python3[5016]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:01:33 np0005595445.novalocal python3[5074]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 26 09:01:34 np0005595445.novalocal python3[5106]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:01:34 np0005595445.novalocal python3[5130]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:01:34 np0005595445.novalocal python3[5154]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:01:35 np0005595445.novalocal python3[5178]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:01:35 np0005595445.novalocal python3[5202]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:01:35 np0005595445.novalocal python3[5226]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:01:37 np0005595445.novalocal sudo[5250]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icphyxnfrezbrpdsawvblrivnhdnfbfu ; /usr/bin/python3'
Jan 26 09:01:37 np0005595445.novalocal sudo[5250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:01:37 np0005595445.novalocal python3[5252]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:01:37 np0005595445.novalocal sudo[5250]: pam_unix(sudo:session): session closed for user root
Jan 26 09:01:38 np0005595445.novalocal sudo[5328]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bllrhiharwlzwntwyefktqpsvdrljsxu ; /usr/bin/python3'
Jan 26 09:01:38 np0005595445.novalocal sudo[5328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:01:38 np0005595445.novalocal python3[5330]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 09:01:38 np0005595445.novalocal sudo[5328]: pam_unix(sudo:session): session closed for user root
Jan 26 09:01:38 np0005595445.novalocal sudo[5401]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vytbevikjqqkxiicaourqcvdpwbeiczu ; /usr/bin/python3'
Jan 26 09:01:38 np0005595445.novalocal sudo[5401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:01:38 np0005595445.novalocal python3[5403]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769418097.9074206-32-165934380594243/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:01:38 np0005595445.novalocal sudo[5401]: pam_unix(sudo:session): session closed for user root
Jan 26 09:01:39 np0005595445.novalocal python3[5451]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:39 np0005595445.novalocal python3[5475]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:40 np0005595445.novalocal python3[5499]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:40 np0005595445.novalocal python3[5523]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:40 np0005595445.novalocal python3[5547]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:40 np0005595445.novalocal python3[5571]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:41 np0005595445.novalocal python3[5595]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:41 np0005595445.novalocal python3[5619]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:41 np0005595445.novalocal python3[5643]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:41 np0005595445.novalocal python3[5667]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:42 np0005595445.novalocal python3[5691]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:42 np0005595445.novalocal python3[5715]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:42 np0005595445.novalocal python3[5739]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:42 np0005595445.novalocal python3[5763]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:43 np0005595445.novalocal python3[5787]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:43 np0005595445.novalocal python3[5811]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:43 np0005595445.novalocal python3[5835]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:43 np0005595445.novalocal python3[5859]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:44 np0005595445.novalocal python3[5883]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:44 np0005595445.novalocal python3[5907]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:44 np0005595445.novalocal python3[5931]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:45 np0005595445.novalocal python3[5955]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:45 np0005595445.novalocal python3[5979]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:45 np0005595445.novalocal python3[6003]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:45 np0005595445.novalocal python3[6027]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:46 np0005595445.novalocal python3[6051]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:01:49 np0005595445.novalocal sudo[6075]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wscauveytmhpxuqwqlvjnfyplmbaosww ; /usr/bin/python3'
Jan 26 09:01:49 np0005595445.novalocal sudo[6075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:01:49 np0005595445.novalocal python3[6077]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 26 09:01:49 np0005595445.novalocal systemd[1]: Starting Time & Date Service...
Jan 26 09:01:49 np0005595445.novalocal systemd[1]: Started Time & Date Service.
Jan 26 09:01:49 np0005595445.novalocal systemd-timedated[6079]: Changed time zone to 'UTC' (UTC).
Jan 26 09:01:49 np0005595445.novalocal sudo[6075]: pam_unix(sudo:session): session closed for user root
Jan 26 09:01:49 np0005595445.novalocal sudo[6106]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmawwvwynbpmrcylypknareyynyikizm ; /usr/bin/python3'
Jan 26 09:01:49 np0005595445.novalocal sudo[6106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:01:50 np0005595445.novalocal python3[6108]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:01:50 np0005595445.novalocal sudo[6106]: pam_unix(sudo:session): session closed for user root
Jan 26 09:01:50 np0005595445.novalocal python3[6184]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 09:01:50 np0005595445.novalocal python3[6255]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769418110.2552183-252-55896218177457/source _original_basename=tmp5wnjhfyl follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:01:51 np0005595445.novalocal python3[6355]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 09:01:51 np0005595445.novalocal python3[6426]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769418111.1008055-302-53741488510479/source _original_basename=tmptmds0qw7 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:01:52 np0005595445.novalocal sudo[6526]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qytocidwzpkyumdawreupksygkjsbymr ; /usr/bin/python3'
Jan 26 09:01:52 np0005595445.novalocal sudo[6526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:01:52 np0005595445.novalocal python3[6528]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 09:01:52 np0005595445.novalocal sudo[6526]: pam_unix(sudo:session): session closed for user root
Jan 26 09:01:52 np0005595445.novalocal sudo[6599]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvpkrathyglfdfoantrotueonscmezif ; /usr/bin/python3'
Jan 26 09:01:52 np0005595445.novalocal sudo[6599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:01:52 np0005595445.novalocal python3[6601]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769418112.239112-382-261840777848615/source _original_basename=tmpdxungfil follow=False checksum=f8e7a25c67610e75d05bab7943d515214e034b21 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:01:52 np0005595445.novalocal sudo[6599]: pam_unix(sudo:session): session closed for user root
Jan 26 09:01:53 np0005595445.novalocal python3[6649]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:01:54 np0005595445.novalocal python3[6675]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:01:54 np0005595445.novalocal sudo[6753]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdeiyzgodsbxqzhfobjusyfpnwvzkzfq ; /usr/bin/python3'
Jan 26 09:01:54 np0005595445.novalocal sudo[6753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:01:54 np0005595445.novalocal python3[6755]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 09:01:54 np0005595445.novalocal sudo[6753]: pam_unix(sudo:session): session closed for user root
Jan 26 09:01:54 np0005595445.novalocal sudo[6826]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztmogjmaywtpprtkyiediyzqwmqkycvx ; /usr/bin/python3'
Jan 26 09:01:54 np0005595445.novalocal sudo[6826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:01:54 np0005595445.novalocal python3[6828]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769418114.3463724-452-230568505334353/source _original_basename=tmplnvbqk7o follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:01:54 np0005595445.novalocal sudo[6826]: pam_unix(sudo:session): session closed for user root
Jan 26 09:01:55 np0005595445.novalocal sudo[6877]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjzeckngygoydkimlqvteckiiaajcouj ; /usr/bin/python3'
Jan 26 09:01:55 np0005595445.novalocal sudo[6877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:01:55 np0005595445.novalocal python3[6879]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-0dcc-2282-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:01:55 np0005595445.novalocal sudo[6877]: pam_unix(sudo:session): session closed for user root
Jan 26 09:01:56 np0005595445.novalocal python3[6907]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env _uses_shell=True zuul_log_id=fa163ef9-e89a-0dcc-2282-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 26 09:01:57 np0005595445.novalocal python3[6935]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:02:18 np0005595445.novalocal sudo[6959]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqafumlcnmgrlcrhddhiwjibbkesnixw ; /usr/bin/python3'
Jan 26 09:02:18 np0005595445.novalocal sudo[6959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:02:18 np0005595445.novalocal python3[6961]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:02:18 np0005595445.novalocal sudo[6959]: pam_unix(sudo:session): session closed for user root
Jan 26 09:02:19 np0005595445.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 26 09:03:18 np0005595445.novalocal sshd-session[4329]: Received disconnect from 38.102.83.114 port 59554:11: disconnected by user
Jan 26 09:03:18 np0005595445.novalocal sshd-session[4329]: Disconnected from user zuul 38.102.83.114 port 59554
Jan 26 09:03:18 np0005595445.novalocal sshd-session[4316]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:03:18 np0005595445.novalocal systemd-logind[783]: Session 1 logged out. Waiting for processes to exit.
Jan 26 09:03:25 np0005595445.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 26 09:03:25 np0005595445.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 26 09:03:25 np0005595445.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 26 09:03:25 np0005595445.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 26 09:03:25 np0005595445.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 26 09:03:25 np0005595445.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 26 09:03:25 np0005595445.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 26 09:03:25 np0005595445.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 26 09:03:25 np0005595445.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 26 09:03:25 np0005595445.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 26 09:03:25 np0005595445.novalocal NetworkManager[856]: <info>  [1769418205.1987] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 26 09:03:25 np0005595445.novalocal systemd-udevd[6965]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 09:03:25 np0005595445.novalocal NetworkManager[856]: <info>  [1769418205.2140] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 09:03:25 np0005595445.novalocal NetworkManager[856]: <info>  [1769418205.2163] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 26 09:03:25 np0005595445.novalocal NetworkManager[856]: <info>  [1769418205.2165] device (eth1): carrier: link connected
Jan 26 09:03:25 np0005595445.novalocal NetworkManager[856]: <info>  [1769418205.2168] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 26 09:03:25 np0005595445.novalocal NetworkManager[856]: <info>  [1769418205.2173] policy: auto-activating connection 'Wired connection 1' (01428c05-ae5c-3e90-b32b-9a07a8ac9b4b)
Jan 26 09:03:25 np0005595445.novalocal NetworkManager[856]: <info>  [1769418205.2177] device (eth1): Activation: starting connection 'Wired connection 1' (01428c05-ae5c-3e90-b32b-9a07a8ac9b4b)
Jan 26 09:03:25 np0005595445.novalocal NetworkManager[856]: <info>  [1769418205.2178] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 09:03:25 np0005595445.novalocal NetworkManager[856]: <info>  [1769418205.2180] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 09:03:25 np0005595445.novalocal NetworkManager[856]: <info>  [1769418205.2183] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 09:03:25 np0005595445.novalocal NetworkManager[856]: <info>  [1769418205.2187] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 26 09:03:25 np0005595445.novalocal systemd[4320]: Starting Mark boot as successful...
Jan 26 09:03:25 np0005595445.novalocal systemd[4320]: Finished Mark boot as successful.
Jan 26 09:03:25 np0005595445.novalocal sshd-session[6969]: Accepted publickey for zuul from 38.102.83.114 port 53974 ssh2: RSA SHA256:pzGu/8MlhtIDRxsRqlS4AZ6R7CLTQo7Ke10EmY50Qfo
Jan 26 09:03:25 np0005595445.novalocal systemd-logind[783]: New session 3 of user zuul.
Jan 26 09:03:25 np0005595445.novalocal systemd[1]: Started Session 3 of User zuul.
Jan 26 09:03:25 np0005595445.novalocal sshd-session[6969]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:03:26 np0005595445.novalocal python3[6996]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-b722-d7c5-00000000018f-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:03:36 np0005595445.novalocal sudo[7074]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpydhyydqvlmibmxcwijuickiorpiznl ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 26 09:03:36 np0005595445.novalocal sudo[7074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:03:36 np0005595445.novalocal python3[7076]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 09:03:36 np0005595445.novalocal sudo[7074]: pam_unix(sudo:session): session closed for user root
Jan 26 09:03:36 np0005595445.novalocal sudo[7147]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvbosfhmgnvxejvmcgvsyodmxvueewvu ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 26 09:03:36 np0005595445.novalocal sudo[7147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:03:36 np0005595445.novalocal python3[7149]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769418215.87125-155-214241887746077/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=6ef8a714238b6321d3120b756c9e2e82d4276bd3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:03:36 np0005595445.novalocal sudo[7147]: pam_unix(sudo:session): session closed for user root
Jan 26 09:03:36 np0005595445.novalocal sudo[7197]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjsyiykefcaxbdkuggyhgerxeidqswdm ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 26 09:03:36 np0005595445.novalocal sudo[7197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:03:36 np0005595445.novalocal python3[7199]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 09:03:36 np0005595445.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 26 09:03:36 np0005595445.novalocal systemd[1]: Stopped Network Manager Wait Online.
Jan 26 09:03:36 np0005595445.novalocal systemd[1]: Stopping Network Manager Wait Online...
Jan 26 09:03:36 np0005595445.novalocal systemd[1]: Stopping Network Manager...
Jan 26 09:03:36 np0005595445.novalocal NetworkManager[856]: <info>  [1769418216.9605] caught SIGTERM, shutting down normally.
Jan 26 09:03:36 np0005595445.novalocal NetworkManager[856]: <info>  [1769418216.9616] dhcp4 (eth0): canceled DHCP transaction
Jan 26 09:03:36 np0005595445.novalocal NetworkManager[856]: <info>  [1769418216.9617] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 09:03:36 np0005595445.novalocal NetworkManager[856]: <info>  [1769418216.9617] dhcp4 (eth0): state changed no lease
Jan 26 09:03:36 np0005595445.novalocal NetworkManager[856]: <info>  [1769418216.9619] manager: NetworkManager state is now CONNECTING
Jan 26 09:03:36 np0005595445.novalocal NetworkManager[856]: <info>  [1769418216.9707] dhcp4 (eth1): canceled DHCP transaction
Jan 26 09:03:36 np0005595445.novalocal NetworkManager[856]: <info>  [1769418216.9707] dhcp4 (eth1): state changed no lease
Jan 26 09:03:36 np0005595445.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 09:03:36 np0005595445.novalocal NetworkManager[856]: <info>  [1769418216.9775] exiting (success)
Jan 26 09:03:36 np0005595445.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 09:03:36 np0005595445.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 26 09:03:36 np0005595445.novalocal systemd[1]: Stopped Network Manager.
Jan 26 09:03:36 np0005595445.novalocal systemd[1]: NetworkManager.service: Consumed 1.036s CPU time, 9.9M memory peak.
Jan 26 09:03:36 np0005595445.novalocal systemd[1]: Starting Network Manager...
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.0195] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:ddaf00d4-1dc5-4d7f-b5ab-626f2ab79e8a)
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.0196] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.0253] manager[0x558bef871000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 26 09:03:37 np0005595445.novalocal systemd[1]: Starting Hostname Service...
Jan 26 09:03:37 np0005595445.novalocal systemd[1]: Started Hostname Service.
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.0941] hostname: hostname: using hostnamed
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.0941] hostname: static hostname changed from (none) to "np0005595445.novalocal"
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.0947] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.0952] manager[0x558bef871000]: rfkill: Wi-Fi hardware radio set enabled
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.0953] manager[0x558bef871000]: rfkill: WWAN hardware radio set enabled
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.0977] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.0977] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.0978] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.0978] manager: Networking is enabled by state file
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.0980] settings: Loaded settings plugin: keyfile (internal)
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.0983] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1003] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1011] dhcp: init: Using DHCP client 'internal'
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1013] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1017] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1021] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1027] device (lo): Activation: starting connection 'lo' (02fbee44-d9d6-4838-9f7e-bcbf35b7d384)
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1032] device (eth0): carrier: link connected
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1036] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1040] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1040] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1045] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1051] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1055] device (eth1): carrier: link connected
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1058] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1062] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (01428c05-ae5c-3e90-b32b-9a07a8ac9b4b) (indicated)
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1062] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1066] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1071] device (eth1): Activation: starting connection 'Wired connection 1' (01428c05-ae5c-3e90-b32b-9a07a8ac9b4b)
Jan 26 09:03:37 np0005595445.novalocal systemd[1]: Started Network Manager.
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1077] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1081] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1082] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1083] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1085] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1087] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1089] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1090] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1092] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1097] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1099] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1105] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1107] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1128] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1130] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1134] device (lo): Activation: successful, device activated.
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1140] dhcp4 (eth0): state changed new lease, address=38.102.83.217
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1145] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1200] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1236] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1238] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1240] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1241] device (eth0): Activation: successful, device activated.
Jan 26 09:03:37 np0005595445.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 26 09:03:37 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418217.1245] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 26 09:03:37 np0005595445.novalocal sudo[7197]: pam_unix(sudo:session): session closed for user root
Jan 26 09:03:37 np0005595445.novalocal python3[7283]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-b722-d7c5-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:03:47 np0005595445.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 09:04:07 np0005595445.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 09:04:22 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418262.3710] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 09:04:22 np0005595445.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 09:04:22 np0005595445.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 09:04:22 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418262.3925] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 09:04:22 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418262.3929] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 09:04:22 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418262.3941] device (eth1): Activation: successful, device activated.
Jan 26 09:04:22 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418262.3949] manager: startup complete
Jan 26 09:04:22 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418262.3952] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 26 09:04:22 np0005595445.novalocal NetworkManager[7211]: <warn>  [1769418262.3962] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 26 09:04:22 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418262.3972] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 26 09:04:22 np0005595445.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 26 09:04:22 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418262.4067] dhcp4 (eth1): canceled DHCP transaction
Jan 26 09:04:22 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418262.4068] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 26 09:04:22 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418262.4068] dhcp4 (eth1): state changed no lease
Jan 26 09:04:22 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418262.4084] policy: auto-activating connection 'ci-private-network' (005e0c67-cd91-5f06-b6a9-1e083d81271f)
Jan 26 09:04:22 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418262.4089] device (eth1): Activation: starting connection 'ci-private-network' (005e0c67-cd91-5f06-b6a9-1e083d81271f)
Jan 26 09:04:22 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418262.4090] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 09:04:22 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418262.4095] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 09:04:22 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418262.4103] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 09:04:22 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418262.4115] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 09:04:22 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418262.4147] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 09:04:22 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418262.4149] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 09:04:22 np0005595445.novalocal NetworkManager[7211]: <info>  [1769418262.4156] device (eth1): Activation: successful, device activated.
Jan 26 09:04:32 np0005595445.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 09:04:37 np0005595445.novalocal sshd-session[6972]: Received disconnect from 38.102.83.114 port 53974:11: disconnected by user
Jan 26 09:04:37 np0005595445.novalocal sshd-session[6972]: Disconnected from user zuul 38.102.83.114 port 53974
Jan 26 09:04:37 np0005595445.novalocal sshd-session[6969]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:04:37 np0005595445.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Jan 26 09:04:37 np0005595445.novalocal systemd[1]: session-3.scope: Consumed 1.298s CPU time.
Jan 26 09:04:37 np0005595445.novalocal systemd-logind[783]: Session 3 logged out. Waiting for processes to exit.
Jan 26 09:04:37 np0005595445.novalocal systemd-logind[783]: Removed session 3.
Jan 26 09:05:18 np0005595445.novalocal sshd-session[7311]: Accepted publickey for zuul from 38.102.83.114 port 45838 ssh2: RSA SHA256:pzGu/8MlhtIDRxsRqlS4AZ6R7CLTQo7Ke10EmY50Qfo
Jan 26 09:05:18 np0005595445.novalocal systemd-logind[783]: New session 4 of user zuul.
Jan 26 09:05:18 np0005595445.novalocal systemd[1]: Started Session 4 of User zuul.
Jan 26 09:05:18 np0005595445.novalocal sshd-session[7311]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:05:18 np0005595445.novalocal sudo[7390]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afcyzedkmfoybcqrcbpjtlaliflikxxw ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 26 09:05:18 np0005595445.novalocal sudo[7390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:05:18 np0005595445.novalocal python3[7392]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 09:05:18 np0005595445.novalocal sudo[7390]: pam_unix(sudo:session): session closed for user root
Jan 26 09:05:18 np0005595445.novalocal sudo[7463]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocfyavwrxeltpylkmnzczxlwlqiuprzr ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 26 09:05:18 np0005595445.novalocal sudo[7463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:05:18 np0005595445.novalocal python3[7465]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769418318.361261-373-179846650525927/source _original_basename=tmpr0ymogg4 follow=False checksum=10e957977f1ea6bc363530e08bbb212998c5f9af backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:05:18 np0005595445.novalocal sudo[7463]: pam_unix(sudo:session): session closed for user root
Jan 26 09:05:22 np0005595445.novalocal sshd-session[7314]: Connection closed by 38.102.83.114 port 45838
Jan 26 09:05:22 np0005595445.novalocal sshd-session[7311]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:05:22 np0005595445.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Jan 26 09:05:22 np0005595445.novalocal systemd-logind[783]: Session 4 logged out. Waiting for processes to exit.
Jan 26 09:05:22 np0005595445.novalocal systemd-logind[783]: Removed session 4.
Jan 26 09:07:10 np0005595445.novalocal systemd[4320]: Created slice User Background Tasks Slice.
Jan 26 09:07:10 np0005595445.novalocal systemd[4320]: Starting Cleanup of User's Temporary Files and Directories...
Jan 26 09:07:10 np0005595445.novalocal systemd[4320]: Finished Cleanup of User's Temporary Files and Directories.
Jan 26 09:12:21 np0005595445.novalocal sshd-session[7497]: Accepted publickey for zuul from 38.102.83.114 port 46466 ssh2: RSA SHA256:pzGu/8MlhtIDRxsRqlS4AZ6R7CLTQo7Ke10EmY50Qfo
Jan 26 09:12:21 np0005595445.novalocal systemd-logind[783]: New session 5 of user zuul.
Jan 26 09:12:21 np0005595445.novalocal systemd[1]: Started Session 5 of User zuul.
Jan 26 09:12:21 np0005595445.novalocal sshd-session[7497]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:12:21 np0005595445.novalocal sudo[7524]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfthfmiiookyohovesvktonavdbwcwev ; /usr/bin/python3'
Jan 26 09:12:21 np0005595445.novalocal sudo[7524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:21 np0005595445.novalocal python3[7526]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-680e-3fc1-00000000217d-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:12:21 np0005595445.novalocal sudo[7524]: pam_unix(sudo:session): session closed for user root
Jan 26 09:12:22 np0005595445.novalocal sudo[7553]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcoizrszfjwecmjayifkmflbgigcpjpa ; /usr/bin/python3'
Jan 26 09:12:22 np0005595445.novalocal sudo[7553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:22 np0005595445.novalocal python3[7555]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:12:22 np0005595445.novalocal sudo[7553]: pam_unix(sudo:session): session closed for user root
Jan 26 09:12:22 np0005595445.novalocal sudo[7579]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bujvfzampuclsvewiensuaajabudnteh ; /usr/bin/python3'
Jan 26 09:12:22 np0005595445.novalocal sudo[7579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:23 np0005595445.novalocal python3[7581]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:12:23 np0005595445.novalocal sudo[7579]: pam_unix(sudo:session): session closed for user root
Jan 26 09:12:23 np0005595445.novalocal sudo[7605]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxwbtqpvdlbwgeoteillywvyttouaxbn ; /usr/bin/python3'
Jan 26 09:12:23 np0005595445.novalocal sudo[7605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:23 np0005595445.novalocal python3[7607]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:12:23 np0005595445.novalocal sudo[7605]: pam_unix(sudo:session): session closed for user root
Jan 26 09:12:23 np0005595445.novalocal sudo[7631]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfgcnhabjdiaccrutuiaozlhskzytdbu ; /usr/bin/python3'
Jan 26 09:12:23 np0005595445.novalocal sudo[7631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:23 np0005595445.novalocal python3[7633]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:12:23 np0005595445.novalocal sudo[7631]: pam_unix(sudo:session): session closed for user root
Jan 26 09:12:24 np0005595445.novalocal sudo[7657]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fictikbqcksujeobaezxndohuzwmuhta ; /usr/bin/python3'
Jan 26 09:12:24 np0005595445.novalocal sudo[7657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:24 np0005595445.novalocal python3[7659]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:12:24 np0005595445.novalocal sudo[7657]: pam_unix(sudo:session): session closed for user root
Jan 26 09:12:25 np0005595445.novalocal sudo[7735]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlfgijlivnbwmcnlnklbwgvkiezbvgfq ; /usr/bin/python3'
Jan 26 09:12:25 np0005595445.novalocal sudo[7735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:25 np0005595445.novalocal python3[7737]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 09:12:25 np0005595445.novalocal sudo[7735]: pam_unix(sudo:session): session closed for user root
Jan 26 09:12:25 np0005595445.novalocal sudo[7808]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzeazzqgpsnijufceeztezogrquzkrcw ; /usr/bin/python3'
Jan 26 09:12:25 np0005595445.novalocal sudo[7808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:25 np0005595445.novalocal python3[7810]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769418745.0358005-538-68336395017277/source _original_basename=tmpfgmn5bfn follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:12:25 np0005595445.novalocal sudo[7808]: pam_unix(sudo:session): session closed for user root
Jan 26 09:12:26 np0005595445.novalocal sudo[7858]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxbaufpxnudvdgovzuqdbzujybdoltha ; /usr/bin/python3'
Jan 26 09:12:26 np0005595445.novalocal sudo[7858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:27 np0005595445.novalocal python3[7860]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 09:12:27 np0005595445.novalocal systemd[1]: Reloading.
Jan 26 09:12:27 np0005595445.novalocal systemd-rc-local-generator[7878]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:12:27 np0005595445.novalocal sudo[7858]: pam_unix(sudo:session): session closed for user root
Jan 26 09:12:28 np0005595445.novalocal sudo[7914]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaxtiawnitltfgavkoejnipgwrwqegdv ; /usr/bin/python3'
Jan 26 09:12:28 np0005595445.novalocal sudo[7914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:28 np0005595445.novalocal python3[7916]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 26 09:12:28 np0005595445.novalocal sudo[7914]: pam_unix(sudo:session): session closed for user root
Jan 26 09:12:29 np0005595445.novalocal sudo[7940]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udlwcajbwupgteasfebxzzoppuvligju ; /usr/bin/python3'
Jan 26 09:12:29 np0005595445.novalocal sudo[7940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:29 np0005595445.novalocal python3[7942]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:12:29 np0005595445.novalocal sudo[7940]: pam_unix(sudo:session): session closed for user root
Jan 26 09:12:29 np0005595445.novalocal sudo[7968]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtefnclwkjdugjpzqxtuoguiywqnmfio ; /usr/bin/python3'
Jan 26 09:12:29 np0005595445.novalocal sudo[7968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:29 np0005595445.novalocal python3[7970]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:12:29 np0005595445.novalocal sudo[7968]: pam_unix(sudo:session): session closed for user root
Jan 26 09:12:29 np0005595445.novalocal sudo[7996]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxvjwxxrbfuyrsuklpfqsqqxnylkllpj ; /usr/bin/python3'
Jan 26 09:12:29 np0005595445.novalocal sudo[7996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:29 np0005595445.novalocal python3[7998]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:12:29 np0005595445.novalocal sudo[7996]: pam_unix(sudo:session): session closed for user root
Jan 26 09:12:30 np0005595445.novalocal sudo[8024]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfubrgnlxhxezhfcjtdaxsvjlymirwip ; /usr/bin/python3'
Jan 26 09:12:30 np0005595445.novalocal sudo[8024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:30 np0005595445.novalocal python3[8026]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:12:30 np0005595445.novalocal sudo[8024]: pam_unix(sudo:session): session closed for user root
Jan 26 09:12:31 np0005595445.novalocal python3[8053]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max; _uses_shell=True zuul_log_id=fa163ef9-e89a-680e-3fc1-000000002184-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:12:31 np0005595445.novalocal python3[8083]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 26 09:12:34 np0005595445.novalocal sshd-session[7500]: Connection closed by 38.102.83.114 port 46466
Jan 26 09:12:34 np0005595445.novalocal sshd-session[7497]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:12:34 np0005595445.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Jan 26 09:12:34 np0005595445.novalocal systemd[1]: session-5.scope: Consumed 3.868s CPU time.
Jan 26 09:12:34 np0005595445.novalocal systemd-logind[783]: Session 5 logged out. Waiting for processes to exit.
Jan 26 09:12:34 np0005595445.novalocal systemd-logind[783]: Removed session 5.
Jan 26 09:12:36 np0005595445.novalocal sshd-session[8088]: Accepted publickey for zuul from 38.102.83.114 port 50708 ssh2: RSA SHA256:pzGu/8MlhtIDRxsRqlS4AZ6R7CLTQo7Ke10EmY50Qfo
Jan 26 09:12:36 np0005595445.novalocal systemd-logind[783]: New session 6 of user zuul.
Jan 26 09:12:36 np0005595445.novalocal systemd[1]: Started Session 6 of User zuul.
Jan 26 09:12:36 np0005595445.novalocal sshd-session[8088]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:12:36 np0005595445.novalocal sudo[8115]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wozljbyzaznomqhshjixhmutlqsihhkw ; /usr/bin/python3'
Jan 26 09:12:36 np0005595445.novalocal sudo[8115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:36 np0005595445.novalocal python3[8117]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 26 09:12:43 np0005595445.novalocal setsebool[8160]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 26 09:12:43 np0005595445.novalocal setsebool[8160]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 26 09:12:57 np0005595445.novalocal kernel: SELinux:  Converting 386 SID table entries...
Jan 26 09:12:57 np0005595445.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 09:12:57 np0005595445.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 26 09:12:57 np0005595445.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 09:12:57 np0005595445.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 26 09:12:57 np0005595445.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 09:12:57 np0005595445.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 09:12:57 np0005595445.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 09:13:08 np0005595445.novalocal kernel: SELinux:  Converting 389 SID table entries...
Jan 26 09:13:08 np0005595445.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 09:13:08 np0005595445.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 26 09:13:08 np0005595445.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 09:13:08 np0005595445.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 26 09:13:08 np0005595445.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 09:13:08 np0005595445.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 09:13:08 np0005595445.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 09:13:26 np0005595445.novalocal dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 26 09:13:26 np0005595445.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 09:13:26 np0005595445.novalocal systemd[1]: Starting man-db-cache-update.service...
Jan 26 09:13:26 np0005595445.novalocal systemd[1]: Reloading.
Jan 26 09:13:26 np0005595445.novalocal systemd-rc-local-generator[8931]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:13:27 np0005595445.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 09:13:28 np0005595445.novalocal sudo[8115]: pam_unix(sudo:session): session closed for user root
Jan 26 09:13:33 np0005595445.novalocal python3[14013]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot" _uses_shell=True zuul_log_id=fa163ef9-e89a-9270-47f7-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:13:34 np0005595445.novalocal kernel: evm: overlay not supported
Jan 26 09:13:34 np0005595445.novalocal systemd[4320]: Starting D-Bus User Message Bus...
Jan 26 09:13:34 np0005595445.novalocal dbus-broker-launch[14550]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 26 09:13:34 np0005595445.novalocal dbus-broker-launch[14550]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 26 09:13:34 np0005595445.novalocal systemd[4320]: Started D-Bus User Message Bus.
Jan 26 09:13:34 np0005595445.novalocal dbus-broker-lau[14550]: Ready
Jan 26 09:13:34 np0005595445.novalocal systemd[4320]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 26 09:13:34 np0005595445.novalocal systemd[4320]: Created slice Slice /user.
Jan 26 09:13:34 np0005595445.novalocal systemd[4320]: podman-14476.scope: unit configures an IP firewall, but not running as root.
Jan 26 09:13:34 np0005595445.novalocal systemd[4320]: (This warning is only shown for the first unit using IP firewalling.)
Jan 26 09:13:34 np0005595445.novalocal systemd[4320]: Started podman-14476.scope.
Jan 26 09:13:34 np0005595445.novalocal systemd[4320]: Started podman-pause-07735550.scope.
Jan 26 09:13:35 np0005595445.novalocal sshd-session[8091]: Connection closed by 38.102.83.114 port 50708
Jan 26 09:13:35 np0005595445.novalocal sshd-session[8088]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:13:35 np0005595445.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Jan 26 09:13:35 np0005595445.novalocal systemd[1]: session-6.scope: Consumed 47.857s CPU time.
Jan 26 09:13:35 np0005595445.novalocal systemd-logind[783]: Session 6 logged out. Waiting for processes to exit.
Jan 26 09:13:35 np0005595445.novalocal systemd-logind[783]: Removed session 6.
Jan 26 09:13:49 np0005595445.novalocal sshd-session[21484]: Connection closed by 38.102.83.222 port 35202 [preauth]
Jan 26 09:13:50 np0005595445.novalocal sshd-session[21486]: Connection closed by 38.102.83.222 port 35212 [preauth]
Jan 26 09:13:50 np0005595445.novalocal sshd-session[21490]: Unable to negotiate with 38.102.83.222 port 35220: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 26 09:13:50 np0005595445.novalocal sshd-session[21485]: Unable to negotiate with 38.102.83.222 port 35230: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 26 09:13:50 np0005595445.novalocal sshd-session[21488]: Unable to negotiate with 38.102.83.222 port 35232: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 26 09:13:54 np0005595445.novalocal sshd-session[23658]: Accepted publickey for zuul from 38.102.83.114 port 43342 ssh2: RSA SHA256:pzGu/8MlhtIDRxsRqlS4AZ6R7CLTQo7Ke10EmY50Qfo
Jan 26 09:13:54 np0005595445.novalocal systemd-logind[783]: New session 7 of user zuul.
Jan 26 09:13:54 np0005595445.novalocal systemd[1]: Started Session 7 of User zuul.
Jan 26 09:13:55 np0005595445.novalocal sshd-session[23658]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:13:55 np0005595445.novalocal python3[23767]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAtPEUMhsrPMrPBtrW9DN2hphJag2++Oa2+RW/bLZtOuHDUx3O2VLMTPtjIlKZnSfgaaLhnryF0u4SgeMfgHgP8= zuul@np0005595443.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:13:55 np0005595445.novalocal sudo[23957]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibsrhdxjrmdpplxtmpgziifoozkdqvdc ; /usr/bin/python3'
Jan 26 09:13:55 np0005595445.novalocal sudo[23957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:13:55 np0005595445.novalocal python3[23969]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAtPEUMhsrPMrPBtrW9DN2hphJag2++Oa2+RW/bLZtOuHDUx3O2VLMTPtjIlKZnSfgaaLhnryF0u4SgeMfgHgP8= zuul@np0005595443.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:13:55 np0005595445.novalocal sudo[23957]: pam_unix(sudo:session): session closed for user root
Jan 26 09:13:56 np0005595445.novalocal sudo[24419]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byakhljqnjukotvjxekhkneesuwoatun ; /usr/bin/python3'
Jan 26 09:13:56 np0005595445.novalocal sudo[24419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:13:56 np0005595445.novalocal python3[24429]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005595445.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 26 09:13:56 np0005595445.novalocal useradd[24510]: new group: name=cloud-admin, GID=1002
Jan 26 09:13:56 np0005595445.novalocal useradd[24510]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Jan 26 09:13:56 np0005595445.novalocal sudo[24419]: pam_unix(sudo:session): session closed for user root
Jan 26 09:13:56 np0005595445.novalocal sudo[24639]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoujchwswttzysjzemqslvgnqshzgkdp ; /usr/bin/python3'
Jan 26 09:13:56 np0005595445.novalocal sudo[24639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:13:57 np0005595445.novalocal python3[24649]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAtPEUMhsrPMrPBtrW9DN2hphJag2++Oa2+RW/bLZtOuHDUx3O2VLMTPtjIlKZnSfgaaLhnryF0u4SgeMfgHgP8= zuul@np0005595443.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 09:13:57 np0005595445.novalocal sudo[24639]: pam_unix(sudo:session): session closed for user root
Jan 26 09:13:57 np0005595445.novalocal sudo[24920]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdozkawnnlwlqsdynphpdzqqnnfkonzq ; /usr/bin/python3'
Jan 26 09:13:57 np0005595445.novalocal sudo[24920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:13:57 np0005595445.novalocal python3[24928]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 09:13:57 np0005595445.novalocal sudo[24920]: pam_unix(sudo:session): session closed for user root
Jan 26 09:13:57 np0005595445.novalocal sudo[25178]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugppoegchoimxhxrkskbpenpojsbmxtf ; /usr/bin/python3'
Jan 26 09:13:57 np0005595445.novalocal sudo[25178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:13:58 np0005595445.novalocal python3[25187]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769418837.356957-151-20203635944896/source _original_basename=tmpy_kqy21s follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:13:58 np0005595445.novalocal sudo[25178]: pam_unix(sudo:session): session closed for user root
Jan 26 09:13:58 np0005595445.novalocal sudo[25538]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emvrjscwmpeobyufdshbslozeenomugx ; /usr/bin/python3'
Jan 26 09:13:58 np0005595445.novalocal sudo[25538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:13:58 np0005595445.novalocal python3[25547]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Jan 26 09:13:58 np0005595445.novalocal systemd[1]: Starting Hostname Service...
Jan 26 09:13:58 np0005595445.novalocal systemd[1]: Started Hostname Service.
Jan 26 09:13:59 np0005595445.novalocal systemd-hostnamed[25675]: Changed pretty hostname to 'compute-1'
Jan 26 09:13:59 compute-1 systemd-hostnamed[25675]: Hostname set to <compute-1> (static)
Jan 26 09:13:59 compute-1 NetworkManager[7211]: <info>  [1769418839.0098] hostname: static hostname changed from "np0005595445.novalocal" to "compute-1"
Jan 26 09:13:59 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 09:13:59 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 09:13:59 compute-1 sudo[25538]: pam_unix(sudo:session): session closed for user root
Jan 26 09:13:59 compute-1 sshd-session[23711]: Connection closed by 38.102.83.114 port 43342
Jan 26 09:13:59 compute-1 sshd-session[23658]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:13:59 compute-1 systemd-logind[783]: Session 7 logged out. Waiting for processes to exit.
Jan 26 09:13:59 compute-1 systemd[1]: session-7.scope: Deactivated successfully.
Jan 26 09:13:59 compute-1 systemd[1]: session-7.scope: Consumed 2.258s CPU time.
Jan 26 09:13:59 compute-1 systemd-logind[783]: Removed session 7.
Jan 26 09:14:09 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 09:14:09 compute-1 sshd-session[29687]: Connection closed by 152.42.136.110 port 45224
Jan 26 09:14:10 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 09:14:10 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 09:14:10 compute-1 systemd[1]: man-db-cache-update.service: Consumed 51.077s CPU time.
Jan 26 09:14:10 compute-1 systemd[1]: run-r369e89110e9a474b95e0da52deae43cf.service: Deactivated successfully.
Jan 26 09:14:29 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 09:16:10 compute-1 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 26 09:16:10 compute-1 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 26 09:16:10 compute-1 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 26 09:16:10 compute-1 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 26 09:18:03 compute-1 sshd-session[29926]: Accepted publickey for zuul from 38.102.83.222 port 47946 ssh2: RSA SHA256:pzGu/8MlhtIDRxsRqlS4AZ6R7CLTQo7Ke10EmY50Qfo
Jan 26 09:18:03 compute-1 systemd-logind[783]: New session 8 of user zuul.
Jan 26 09:18:03 compute-1 systemd[1]: Started Session 8 of User zuul.
Jan 26 09:18:03 compute-1 sshd-session[29926]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:18:04 compute-1 python3[30002]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:18:06 compute-1 sudo[30117]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lykeovtgjzxqefbtwebvpxixqrtjeohg ; /usr/bin/python3'
Jan 26 09:18:06 compute-1 sshd-session[30041]: Connection closed by 178.62.249.31 port 59450
Jan 26 09:18:06 compute-1 sudo[30117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:18:06 compute-1 python3[30119]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 09:18:06 compute-1 sudo[30117]: pam_unix(sudo:session): session closed for user root
Jan 26 09:18:06 compute-1 sudo[30190]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfgwmlogpldwmvfkvtjwtlwvxptenmsu ; /usr/bin/python3'
Jan 26 09:18:06 compute-1 sudo[30190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:18:06 compute-1 python3[30192]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769419085.905869-33987-71487597389602/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:18:06 compute-1 sudo[30190]: pam_unix(sudo:session): session closed for user root
Jan 26 09:18:06 compute-1 sudo[30216]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmmdgaxhqgbodtfrsgkatylpqewfgiij ; /usr/bin/python3'
Jan 26 09:18:06 compute-1 sudo[30216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:18:06 compute-1 python3[30218]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 09:18:06 compute-1 sudo[30216]: pam_unix(sudo:session): session closed for user root
Jan 26 09:18:07 compute-1 sudo[30289]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhtukmigzhgpnvpieanaaxwelxiotyzk ; /usr/bin/python3'
Jan 26 09:18:07 compute-1 sudo[30289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:18:07 compute-1 python3[30291]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769419085.905869-33987-71487597389602/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:18:07 compute-1 sudo[30289]: pam_unix(sudo:session): session closed for user root
Jan 26 09:18:07 compute-1 sudo[30315]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwzajvigrvkzsffwtdxepvthatijolno ; /usr/bin/python3'
Jan 26 09:18:07 compute-1 sudo[30315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:18:07 compute-1 python3[30317]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 09:18:07 compute-1 sudo[30315]: pam_unix(sudo:session): session closed for user root
Jan 26 09:18:07 compute-1 sudo[30388]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mikfscejfsjappfmtxyztaqsmfyjexml ; /usr/bin/python3'
Jan 26 09:18:07 compute-1 sudo[30388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:18:07 compute-1 python3[30390]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769419085.905869-33987-71487597389602/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:18:07 compute-1 sudo[30388]: pam_unix(sudo:session): session closed for user root
Jan 26 09:18:08 compute-1 sudo[30414]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmwtasqngiiqiufuqpgkwmmkzbqyyjpw ; /usr/bin/python3'
Jan 26 09:18:08 compute-1 sudo[30414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:18:08 compute-1 python3[30416]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 09:18:08 compute-1 sudo[30414]: pam_unix(sudo:session): session closed for user root
Jan 26 09:18:08 compute-1 sudo[30487]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbfxidzbnuixnjsbpiopeqexiugphibt ; /usr/bin/python3'
Jan 26 09:18:08 compute-1 sudo[30487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:18:08 compute-1 python3[30489]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769419085.905869-33987-71487597389602/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:18:08 compute-1 sudo[30487]: pam_unix(sudo:session): session closed for user root
Jan 26 09:18:08 compute-1 sudo[30513]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wodnbfzonhlsfjmsdgqykvhgmnfhvfhb ; /usr/bin/python3'
Jan 26 09:18:08 compute-1 sudo[30513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:18:08 compute-1 python3[30515]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 09:18:08 compute-1 sudo[30513]: pam_unix(sudo:session): session closed for user root
Jan 26 09:18:08 compute-1 sudo[30586]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avcrrbubyqvbbaxeuwoyxuxrhswoaqse ; /usr/bin/python3'
Jan 26 09:18:08 compute-1 sudo[30586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:18:09 compute-1 python3[30588]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769419085.905869-33987-71487597389602/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:18:09 compute-1 sudo[30586]: pam_unix(sudo:session): session closed for user root
Jan 26 09:18:09 compute-1 sudo[30612]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejlwceugzomujyprkdddamrpcevyybtu ; /usr/bin/python3'
Jan 26 09:18:09 compute-1 sudo[30612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:18:09 compute-1 python3[30614]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 09:18:09 compute-1 sudo[30612]: pam_unix(sudo:session): session closed for user root
Jan 26 09:18:09 compute-1 sudo[30685]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctuhalgxqvpubhcixmpzmwwddtdqazwk ; /usr/bin/python3'
Jan 26 09:18:09 compute-1 sudo[30685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:18:09 compute-1 python3[30687]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769419085.905869-33987-71487597389602/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:18:09 compute-1 sudo[30685]: pam_unix(sudo:session): session closed for user root
Jan 26 09:18:09 compute-1 sudo[30711]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onfsxcpfyzvteiigxzysngxtpyrvdhxz ; /usr/bin/python3'
Jan 26 09:18:09 compute-1 sudo[30711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:18:09 compute-1 python3[30713]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 09:18:09 compute-1 sudo[30711]: pam_unix(sudo:session): session closed for user root
Jan 26 09:18:10 compute-1 sudo[30784]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucnsdyfwbejhulgwzzsqbfnkiqenwkyv ; /usr/bin/python3'
Jan 26 09:18:10 compute-1 sudo[30784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:18:10 compute-1 python3[30786]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769419085.905869-33987-71487597389602/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:18:10 compute-1 sudo[30784]: pam_unix(sudo:session): session closed for user root
Jan 26 09:18:23 compute-1 python3[30834]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:18:52 compute-1 sshd-session[30836]: Connection closed by 104.248.198.71 port 35394
Jan 26 09:19:01 compute-1 anacron[4199]: Job `cron.daily' started
Jan 26 09:19:01 compute-1 anacron[4199]: Job `cron.daily' terminated
Jan 26 09:19:24 compute-1 sshd-session[30839]: Connection closed by authenticating user root 178.62.249.31 port 38566 [preauth]
Jan 26 09:20:14 compute-1 sshd-session[30841]: Connection closed by authenticating user root 104.248.198.71 port 44792 [preauth]
Jan 26 09:20:17 compute-1 sshd-session[30843]: Connection closed by authenticating user root 178.62.249.31 port 40264 [preauth]
Jan 26 09:21:03 compute-1 sshd-session[30846]: Connection closed by authenticating user root 152.42.136.110 port 45354 [preauth]
Jan 26 09:21:04 compute-1 sshd-session[30848]: Connection closed by authenticating user root 104.248.198.71 port 44524 [preauth]
Jan 26 09:21:09 compute-1 sshd-session[30850]: Connection closed by authenticating user root 178.62.249.31 port 55604 [preauth]
Jan 26 09:21:53 compute-1 sshd-session[30852]: Connection closed by authenticating user root 152.42.136.110 port 44368 [preauth]
Jan 26 09:21:54 compute-1 sshd-session[30854]: Connection closed by authenticating user root 104.248.198.71 port 48420 [preauth]
Jan 26 09:21:59 compute-1 sshd-session[30856]: Connection closed by authenticating user root 178.62.249.31 port 39344 [preauth]
Jan 26 09:22:38 compute-1 sshd-session[30859]: Connection closed by authenticating user root 152.42.136.110 port 42318 [preauth]
Jan 26 09:22:43 compute-1 sshd-session[30861]: Connection closed by authenticating user root 104.248.198.71 port 33514 [preauth]
Jan 26 09:22:49 compute-1 sshd-session[30863]: Connection closed by authenticating user root 178.62.249.31 port 49398 [preauth]
Jan 26 09:23:22 compute-1 sshd-session[29929]: Received disconnect from 38.102.83.222 port 47946:11: disconnected by user
Jan 26 09:23:22 compute-1 sshd-session[29929]: Disconnected from user zuul 38.102.83.222 port 47946
Jan 26 09:23:22 compute-1 sshd-session[29926]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:23:22 compute-1 systemd[1]: session-8.scope: Deactivated successfully.
Jan 26 09:23:22 compute-1 systemd[1]: session-8.scope: Consumed 4.856s CPU time.
Jan 26 09:23:22 compute-1 systemd-logind[783]: Session 8 logged out. Waiting for processes to exit.
Jan 26 09:23:22 compute-1 systemd-logind[783]: Removed session 8.
Jan 26 09:23:24 compute-1 sshd-session[30866]: Connection closed by authenticating user root 152.42.136.110 port 40184 [preauth]
Jan 26 09:23:33 compute-1 sshd-session[30868]: Connection closed by authenticating user root 104.248.198.71 port 50130 [preauth]
Jan 26 09:23:37 compute-1 sshd-session[30870]: Connection closed by authenticating user root 178.62.249.31 port 54062 [preauth]
Jan 26 09:24:08 compute-1 sshd-session[30872]: Connection closed by authenticating user root 152.42.136.110 port 45804 [preauth]
Jan 26 09:24:22 compute-1 sshd-session[30874]: Connection closed by authenticating user root 104.248.198.71 port 45798 [preauth]
Jan 26 09:24:24 compute-1 sshd-session[30876]: Connection closed by authenticating user root 178.62.249.31 port 51902 [preauth]
Jan 26 09:24:55 compute-1 sshd-session[30878]: Connection closed by authenticating user root 152.42.136.110 port 46968 [preauth]
Jan 26 09:25:11 compute-1 sshd-session[30880]: Connection closed by authenticating user root 178.62.249.31 port 36538 [preauth]
Jan 26 09:25:12 compute-1 sshd-session[30882]: Connection closed by authenticating user root 104.248.198.71 port 47842 [preauth]
Jan 26 09:25:40 compute-1 sshd-session[30884]: Connection closed by authenticating user root 152.42.136.110 port 51454 [preauth]
Jan 26 09:25:57 compute-1 sshd-session[30886]: Connection closed by authenticating user root 178.62.249.31 port 60218 [preauth]
Jan 26 09:26:00 compute-1 sshd-session[30888]: Connection closed by authenticating user root 104.248.198.71 port 52216 [preauth]
Jan 26 09:26:26 compute-1 sshd-session[30893]: Connection closed by authenticating user root 152.42.136.110 port 60216 [preauth]
Jan 26 09:26:43 compute-1 sshd-session[30896]: Connection closed by authenticating user root 178.62.249.31 port 57422 [preauth]
Jan 26 09:26:47 compute-1 sshd-session[30898]: Connection closed by authenticating user root 104.248.198.71 port 55436 [preauth]
Jan 26 09:27:10 compute-1 sshd-session[30900]: Connection closed by authenticating user root 152.42.136.110 port 59574 [preauth]
Jan 26 09:27:27 compute-1 sshd-session[30903]: Connection closed by authenticating user root 178.62.249.31 port 49580 [preauth]
Jan 26 09:27:35 compute-1 sshd-session[30905]: Connection closed by authenticating user root 104.248.198.71 port 44286 [preauth]
Jan 26 09:27:55 compute-1 sshd-session[30907]: Connection closed by authenticating user root 152.42.136.110 port 34154 [preauth]
Jan 26 09:28:12 compute-1 sshd-session[30909]: Connection closed by authenticating user root 178.62.249.31 port 52496 [preauth]
Jan 26 09:28:20 compute-1 sshd-session[30911]: Connection closed by authenticating user root 104.248.198.71 port 34722 [preauth]
Jan 26 09:28:40 compute-1 sshd-session[30913]: Connection closed by authenticating user root 152.42.136.110 port 54490 [preauth]
Jan 26 09:28:56 compute-1 sshd-session[30916]: Connection closed by authenticating user root 178.62.249.31 port 48988 [preauth]
Jan 26 09:29:09 compute-1 sshd-session[30918]: Connection closed by authenticating user root 104.248.198.71 port 54736 [preauth]
Jan 26 09:29:25 compute-1 sshd-session[30921]: Connection closed by authenticating user root 152.42.136.110 port 38418 [preauth]
Jan 26 09:29:30 compute-1 sshd-session[30923]: Accepted publickey for zuul from 192.168.122.30 port 48054 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:29:30 compute-1 systemd-logind[783]: New session 9 of user zuul.
Jan 26 09:29:30 compute-1 systemd[1]: Started Session 9 of User zuul.
Jan 26 09:29:30 compute-1 sshd-session[30923]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:29:32 compute-1 python3.9[31076]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:29:33 compute-1 sudo[31255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocuhzhsiszmpghngrpggthrtudhcinhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419772.7009115-52-183939666596762/AnsiballZ_command.py'
Jan 26 09:29:33 compute-1 sudo[31255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:29:33 compute-1 python3.9[31257]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:29:40 compute-1 sudo[31255]: pam_unix(sudo:session): session closed for user root
Jan 26 09:29:43 compute-1 sshd-session[31314]: Connection closed by authenticating user root 178.62.249.31 port 41970 [preauth]
Jan 26 09:29:43 compute-1 sshd-session[30926]: Connection closed by 192.168.122.30 port 48054
Jan 26 09:29:43 compute-1 sshd-session[30923]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:29:43 compute-1 systemd[1]: session-9.scope: Deactivated successfully.
Jan 26 09:29:43 compute-1 systemd[1]: session-9.scope: Consumed 7.655s CPU time.
Jan 26 09:29:43 compute-1 systemd-logind[783]: Session 9 logged out. Waiting for processes to exit.
Jan 26 09:29:43 compute-1 systemd-logind[783]: Removed session 9.
Jan 26 09:29:58 compute-1 sshd-session[31316]: Connection closed by authenticating user root 104.248.198.71 port 35140 [preauth]
Jan 26 09:29:59 compute-1 sshd-session[31318]: Accepted publickey for zuul from 192.168.122.30 port 60204 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:29:59 compute-1 systemd-logind[783]: New session 10 of user zuul.
Jan 26 09:29:59 compute-1 systemd[1]: Started Session 10 of User zuul.
Jan 26 09:29:59 compute-1 sshd-session[31318]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:29:59 compute-1 python3.9[31471]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 26 09:30:01 compute-1 python3.9[31645]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:30:02 compute-1 sudo[31795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjvoimaohuirjpbwkzqdppizpkvkfqgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419801.8055634-89-28905435014686/AnsiballZ_command.py'
Jan 26 09:30:02 compute-1 sudo[31795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:30:02 compute-1 python3.9[31797]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:30:02 compute-1 sudo[31795]: pam_unix(sudo:session): session closed for user root
Jan 26 09:30:03 compute-1 sudo[31948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uouqjgcngfzjkejtihyrgnobwinjtjmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419802.833034-125-157761689297200/AnsiballZ_stat.py'
Jan 26 09:30:03 compute-1 sudo[31948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:30:03 compute-1 python3.9[31950]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:30:03 compute-1 sudo[31948]: pam_unix(sudo:session): session closed for user root
Jan 26 09:30:04 compute-1 sudo[32100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liokqrllylsmdauaqjkxcdyurqzinpnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419803.7123382-149-20877136448147/AnsiballZ_file.py'
Jan 26 09:30:04 compute-1 sudo[32100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:30:04 compute-1 python3.9[32102]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:30:04 compute-1 sudo[32100]: pam_unix(sudo:session): session closed for user root
Jan 26 09:30:04 compute-1 sudo[32252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jacmqtdlasnrgdxraoaftulccgdbcuiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419804.55097-173-203606680384125/AnsiballZ_stat.py'
Jan 26 09:30:04 compute-1 sudo[32252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:30:05 compute-1 python3.9[32254]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:30:05 compute-1 sudo[32252]: pam_unix(sudo:session): session closed for user root
Jan 26 09:30:05 compute-1 sudo[32375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcawurwukqolklgfxcxmtudsbjrnamil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419804.55097-173-203606680384125/AnsiballZ_copy.py'
Jan 26 09:30:05 compute-1 sudo[32375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:30:05 compute-1 python3.9[32377]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769419804.55097-173-203606680384125/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:30:05 compute-1 sudo[32375]: pam_unix(sudo:session): session closed for user root
Jan 26 09:30:06 compute-1 sudo[32527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeeewhfbgfohacupaubpmqczdwcnczff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419805.8279765-218-281137975045914/AnsiballZ_setup.py'
Jan 26 09:30:06 compute-1 sudo[32527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:30:06 compute-1 python3.9[32529]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:30:06 compute-1 sudo[32527]: pam_unix(sudo:session): session closed for user root
Jan 26 09:30:06 compute-1 sudo[32683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njwodcwyiqchilrirdnhwqszwsdfpbvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419806.7252939-242-279920916671146/AnsiballZ_file.py'
Jan 26 09:30:06 compute-1 sudo[32683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:30:07 compute-1 python3.9[32685]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:30:07 compute-1 sudo[32683]: pam_unix(sudo:session): session closed for user root
Jan 26 09:30:07 compute-1 sudo[32835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvqrhqyngxtpgczqgqlidmsfkzbrvozp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419807.3974175-269-159431770376300/AnsiballZ_file.py'
Jan 26 09:30:07 compute-1 sudo[32835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:30:07 compute-1 python3.9[32837]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:30:07 compute-1 sudo[32835]: pam_unix(sudo:session): session closed for user root
Jan 26 09:30:08 compute-1 python3.9[32987]: ansible-ansible.builtin.service_facts Invoked
Jan 26 09:30:09 compute-1 sshd-session[32988]: Connection closed by authenticating user root 152.42.136.110 port 52694 [preauth]
Jan 26 09:30:16 compute-1 python3.9[33244]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:30:16 compute-1 python3.9[33394]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:30:18 compute-1 python3.9[33548]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:30:18 compute-1 sshd-session[33117]: error: maximum authentication attempts exceeded for root from 211.196.22.164 port 36658 ssh2 [preauth]
Jan 26 09:30:18 compute-1 sshd-session[33117]: Disconnecting authenticating user root 211.196.22.164 port 36658: Too many authentication failures [preauth]
Jan 26 09:30:18 compute-1 sudo[33704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyhpsjptyflprdycqbulmutrpqwjtpsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419818.601333-413-182504620142687/AnsiballZ_setup.py'
Jan 26 09:30:18 compute-1 sudo[33704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:30:19 compute-1 python3.9[33706]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 09:30:19 compute-1 sudo[33704]: pam_unix(sudo:session): session closed for user root
Jan 26 09:30:19 compute-1 sudo[33790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-astieyrykxdprpnoilppuwzyfvxxifey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419818.601333-413-182504620142687/AnsiballZ_dnf.py'
Jan 26 09:30:19 compute-1 sudo[33790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:30:20 compute-1 python3.9[33792]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:30:23 compute-1 sshd-session[33707]: error: maximum authentication attempts exceeded for root from 211.196.22.164 port 36926 ssh2 [preauth]
Jan 26 09:30:23 compute-1 sshd-session[33707]: Disconnecting authenticating user root 211.196.22.164 port 36926: Too many authentication failures [preauth]
Jan 26 09:30:28 compute-1 sshd-session[33856]: error: maximum authentication attempts exceeded for root from 211.196.22.164 port 37190 ssh2 [preauth]
Jan 26 09:30:28 compute-1 sshd-session[33856]: Disconnecting authenticating user root 211.196.22.164 port 37190: Too many authentication failures [preauth]
Jan 26 09:30:29 compute-1 sshd-session[33863]: Connection closed by authenticating user root 178.62.249.31 port 57464 [preauth]
Jan 26 09:30:32 compute-1 sshd-session[33865]: Received disconnect from 211.196.22.164 port 37448:11: disconnected by user [preauth]
Jan 26 09:30:32 compute-1 sshd-session[33865]: Disconnected from authenticating user root 211.196.22.164 port 37448 [preauth]
Jan 26 09:30:36 compute-1 sshd-session[33879]: Invalid user admin from 211.196.22.164 port 37762
Jan 26 09:30:37 compute-1 sshd-session[33879]: error: maximum authentication attempts exceeded for invalid user admin from 211.196.22.164 port 37762 ssh2 [preauth]
Jan 26 09:30:37 compute-1 sshd-session[33879]: Disconnecting invalid user admin 211.196.22.164 port 37762: Too many authentication failures [preauth]
Jan 26 09:30:45 compute-1 sshd-session[33916]: Invalid user admin from 211.196.22.164 port 38006
Jan 26 09:30:46 compute-1 sshd-session[33916]: error: maximum authentication attempts exceeded for invalid user admin from 211.196.22.164 port 38006 ssh2 [preauth]
Jan 26 09:30:46 compute-1 sshd-session[33916]: Disconnecting invalid user admin 211.196.22.164 port 38006: Too many authentication failures [preauth]
Jan 26 09:30:46 compute-1 sshd-session[33947]: Connection closed by 206.189.130.90 port 58208
Jan 26 09:30:49 compute-1 sshd-session[33950]: Connection closed by authenticating user root 104.248.198.71 port 34796 [preauth]
Jan 26 09:30:51 compute-1 sshd-session[33948]: Invalid user admin from 211.196.22.164 port 38488
Jan 26 09:30:51 compute-1 sshd-session[33948]: Received disconnect from 211.196.22.164 port 38488:11: disconnected by user [preauth]
Jan 26 09:30:51 compute-1 sshd-session[33948]: Disconnected from invalid user admin 211.196.22.164 port 38488 [preauth]
Jan 26 09:30:55 compute-1 sshd-session[33952]: Invalid user oracle from 211.196.22.164 port 38818
Jan 26 09:30:55 compute-1 sshd-session[33954]: Connection closed by authenticating user root 152.42.136.110 port 39236 [preauth]
Jan 26 09:30:56 compute-1 sshd-session[33952]: error: maximum authentication attempts exceeded for invalid user oracle from 211.196.22.164 port 38818 ssh2 [preauth]
Jan 26 09:30:56 compute-1 sshd-session[33952]: Disconnecting invalid user oracle 211.196.22.164 port 38818: Too many authentication failures [preauth]
Jan 26 09:31:00 compute-1 sshd-session[33956]: Invalid user oracle from 211.196.22.164 port 39090
Jan 26 09:31:01 compute-1 sshd-session[33956]: error: maximum authentication attempts exceeded for invalid user oracle from 211.196.22.164 port 39090 ssh2 [preauth]
Jan 26 09:31:01 compute-1 sshd-session[33956]: Disconnecting invalid user oracle 211.196.22.164 port 39090: Too many authentication failures [preauth]
Jan 26 09:31:03 compute-1 systemd[1]: Reloading.
Jan 26 09:31:03 compute-1 systemd-rc-local-generator[34012]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:31:03 compute-1 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 26 09:31:04 compute-1 sshd-session[33965]: Invalid user oracle from 211.196.22.164 port 39344
Jan 26 09:31:04 compute-1 sshd-session[33965]: Received disconnect from 211.196.22.164 port 39344:11: disconnected by user [preauth]
Jan 26 09:31:04 compute-1 sshd-session[33965]: Disconnected from invalid user oracle 211.196.22.164 port 39344 [preauth]
Jan 26 09:31:04 compute-1 systemd[1]: Reloading.
Jan 26 09:31:04 compute-1 systemd-rc-local-generator[34059]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:31:04 compute-1 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 26 09:31:04 compute-1 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 26 09:31:04 compute-1 systemd[1]: Reloading.
Jan 26 09:31:04 compute-1 systemd-rc-local-generator[34102]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:31:04 compute-1 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 26 09:31:05 compute-1 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 26 09:31:05 compute-1 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 26 09:31:05 compute-1 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 26 09:31:05 compute-1 sshd-session[33963]: Connection closed by authenticating user root 206.189.130.90 port 42730 [preauth]
Jan 26 09:31:07 compute-1 sshd-session[34070]: Invalid user usuario from 211.196.22.164 port 39564
Jan 26 09:31:08 compute-1 sshd-session[34070]: error: maximum authentication attempts exceeded for invalid user usuario from 211.196.22.164 port 39564 ssh2 [preauth]
Jan 26 09:31:08 compute-1 sshd-session[34070]: Disconnecting invalid user usuario 211.196.22.164 port 39564: Too many authentication failures [preauth]
Jan 26 09:31:16 compute-1 sshd-session[34131]: Invalid user usuario from 211.196.22.164 port 39866
Jan 26 09:31:17 compute-1 sshd-session[34131]: error: maximum authentication attempts exceeded for invalid user usuario from 211.196.22.164 port 39866 ssh2 [preauth]
Jan 26 09:31:17 compute-1 sshd-session[34131]: Disconnecting invalid user usuario 211.196.22.164 port 39866: Too many authentication failures [preauth]
Jan 26 09:31:17 compute-1 sshd-session[34153]: Connection closed by authenticating user root 178.62.249.31 port 42662 [preauth]
Jan 26 09:31:21 compute-1 sshd-session[34161]: Invalid user usuario from 211.196.22.164 port 40316
Jan 26 09:31:22 compute-1 sshd-session[34161]: Received disconnect from 211.196.22.164 port 40316:11: disconnected by user [preauth]
Jan 26 09:31:22 compute-1 sshd-session[34161]: Disconnected from invalid user usuario 211.196.22.164 port 40316 [preauth]
Jan 26 09:31:24 compute-1 sshd-session[34178]: Invalid user test from 211.196.22.164 port 40556
Jan 26 09:31:25 compute-1 sshd-session[34178]: error: maximum authentication attempts exceeded for invalid user test from 211.196.22.164 port 40556 ssh2 [preauth]
Jan 26 09:31:25 compute-1 sshd-session[34178]: Disconnecting invalid user test 211.196.22.164 port 40556: Too many authentication failures [preauth]
Jan 26 09:31:27 compute-1 sshd-session[34202]: Invalid user test from 211.196.22.164 port 40848
Jan 26 09:31:28 compute-1 sshd-session[34202]: error: maximum authentication attempts exceeded for invalid user test from 211.196.22.164 port 40848 ssh2 [preauth]
Jan 26 09:31:28 compute-1 sshd-session[34202]: Disconnecting invalid user test 211.196.22.164 port 40848: Too many authentication failures [preauth]
Jan 26 09:31:30 compute-1 sshd-session[34216]: Invalid user test from 211.196.22.164 port 41114
Jan 26 09:31:31 compute-1 sshd-session[34216]: Received disconnect from 211.196.22.164 port 41114:11: disconnected by user [preauth]
Jan 26 09:31:31 compute-1 sshd-session[34216]: Disconnected from invalid user test 211.196.22.164 port 41114 [preauth]
Jan 26 09:31:40 compute-1 sshd-session[34234]: Invalid user user from 211.196.22.164 port 41286
Jan 26 09:31:40 compute-1 sshd-session[34293]: Connection closed by authenticating user root 152.42.136.110 port 57960 [preauth]
Jan 26 09:31:41 compute-1 sshd-session[34295]: Connection closed by authenticating user root 104.248.198.71 port 53784 [preauth]
Jan 26 09:31:41 compute-1 sshd-session[34234]: error: maximum authentication attempts exceeded for invalid user user from 211.196.22.164 port 41286 ssh2 [preauth]
Jan 26 09:31:41 compute-1 sshd-session[34234]: Disconnecting invalid user user 211.196.22.164 port 41286: Too many authentication failures [preauth]
Jan 26 09:31:51 compute-1 sshd-session[34302]: Connection closed by 211.196.22.164 port 41874 [preauth]
Jan 26 09:32:05 compute-1 sshd-session[34358]: Connection closed by authenticating user root 178.62.249.31 port 41166 [preauth]
Jan 26 09:32:12 compute-1 kernel: SELinux:  Converting 2723 SID table entries...
Jan 26 09:32:12 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 09:32:12 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 26 09:32:12 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 09:32:12 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 26 09:32:12 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 09:32:12 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 09:32:12 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 09:32:13 compute-1 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 26 09:32:13 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 09:32:13 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 09:32:13 compute-1 systemd[1]: Reloading.
Jan 26 09:32:13 compute-1 systemd-rc-local-generator[34470]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:32:13 compute-1 systemd[1]: Starting dnf makecache...
Jan 26 09:32:13 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 09:32:13 compute-1 dnf[34493]: Failed determining last makecache time.
Jan 26 09:32:13 compute-1 dnf[34493]: delorean-openstack-barbican-42b4c41831408a8e323 110 kB/s | 3.0 kB     00:00
Jan 26 09:32:13 compute-1 dnf[34493]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 189 kB/s | 3.0 kB     00:00
Jan 26 09:32:13 compute-1 dnf[34493]: delorean-openstack-cinder-1c00d6490d88e436f26ef 181 kB/s | 3.0 kB     00:00
Jan 26 09:32:13 compute-1 dnf[34493]: delorean-python-stevedore-c4acc5639fd2329372142 170 kB/s | 3.0 kB     00:00
Jan 26 09:32:13 compute-1 dnf[34493]: delorean-python-cloudkitty-tests-tempest-2c80f8 179 kB/s | 3.0 kB     00:00
Jan 26 09:32:13 compute-1 dnf[34493]: delorean-os-refresh-config-9bfc52b5049be2d8de61 159 kB/s | 3.0 kB     00:00
Jan 26 09:32:13 compute-1 dnf[34493]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 173 kB/s | 3.0 kB     00:00
Jan 26 09:32:13 compute-1 sudo[33790]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:13 compute-1 dnf[34493]: delorean-python-designate-tests-tempest-347fdbc 137 kB/s | 3.0 kB     00:00
Jan 26 09:32:14 compute-1 dnf[34493]: delorean-openstack-glance-1fd12c29b339f30fe823e 149 kB/s | 3.0 kB     00:00
Jan 26 09:32:14 compute-1 dnf[34493]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 180 kB/s | 3.0 kB     00:00
Jan 26 09:32:14 compute-1 dnf[34493]: delorean-openstack-manila-3c01b7181572c95dac462 181 kB/s | 3.0 kB     00:00
Jan 26 09:32:14 compute-1 dnf[34493]: delorean-python-whitebox-neutron-tests-tempest- 176 kB/s | 3.0 kB     00:00
Jan 26 09:32:14 compute-1 dnf[34493]: delorean-openstack-octavia-ba397f07a7331190208c 140 kB/s | 3.0 kB     00:00
Jan 26 09:32:14 compute-1 dnf[34493]: delorean-openstack-watcher-c014f81a8647287f6dcc 158 kB/s | 3.0 kB     00:00
Jan 26 09:32:14 compute-1 dnf[34493]: delorean-ansible-config_template-5ccaa22121a7ff 181 kB/s | 3.0 kB     00:00
Jan 26 09:32:14 compute-1 dnf[34493]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 177 kB/s | 3.0 kB     00:00
Jan 26 09:32:14 compute-1 dnf[34493]: delorean-openstack-swift-dc98a8463506ac520c469a 163 kB/s | 3.0 kB     00:00
Jan 26 09:32:14 compute-1 dnf[34493]: delorean-python-tempestconf-8515371b7cceebd4282 134 kB/s | 3.0 kB     00:00
Jan 26 09:32:14 compute-1 dnf[34493]: delorean-openstack-heat-ui-013accbfd179753bc3f0 140 kB/s | 3.0 kB     00:00
Jan 26 09:32:14 compute-1 dnf[34493]: CentOS Stream 9 - BaseOS                         61 kB/s | 6.7 kB     00:00
Jan 26 09:32:14 compute-1 sudo[35407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dftufnfjzzrsnflaybdekrcgzlxrxtzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419934.2264469-451-95970965403025/AnsiballZ_command.py'
Jan 26 09:32:14 compute-1 sudo[35407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:14 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 09:32:14 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 09:32:14 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.198s CPU time.
Jan 26 09:32:14 compute-1 systemd[1]: run-rb9c92ec70e5d4f1e8cf6b98243a9f807.service: Deactivated successfully.
Jan 26 09:32:14 compute-1 dnf[34493]: CentOS Stream 9 - AppStream                      29 kB/s | 6.8 kB     00:00
Jan 26 09:32:14 compute-1 python3.9[35409]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:32:14 compute-1 dnf[34493]: CentOS Stream 9 - CRB                            58 kB/s | 6.6 kB     00:00
Jan 26 09:32:15 compute-1 dnf[34493]: CentOS Stream 9 - Extras packages                64 kB/s | 7.3 kB     00:00
Jan 26 09:32:15 compute-1 dnf[34493]: dlrn-antelope-testing                           108 kB/s | 3.0 kB     00:00
Jan 26 09:32:15 compute-1 dnf[34493]: dlrn-antelope-build-deps                        149 kB/s | 3.0 kB     00:00
Jan 26 09:32:15 compute-1 dnf[34493]: centos9-rabbitmq                                 35 kB/s | 3.0 kB     00:00
Jan 26 09:32:15 compute-1 dnf[34493]: centos9-storage                                 106 kB/s | 3.0 kB     00:00
Jan 26 09:32:15 compute-1 dnf[34493]: centos9-opstools                                121 kB/s | 3.0 kB     00:00
Jan 26 09:32:15 compute-1 dnf[34493]: NFV SIG OpenvSwitch                             115 kB/s | 3.0 kB     00:00
Jan 26 09:32:15 compute-1 dnf[34493]: repo-setup-centos-appstream                     113 kB/s | 4.4 kB     00:00
Jan 26 09:32:15 compute-1 dnf[34493]: repo-setup-centos-baseos                         95 kB/s | 3.9 kB     00:00
Jan 26 09:32:15 compute-1 sudo[35407]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:15 compute-1 dnf[34493]: repo-setup-centos-highavailability              142 kB/s | 3.9 kB     00:00
Jan 26 09:32:15 compute-1 dnf[34493]: repo-setup-centos-powertools                    204 kB/s | 4.3 kB     00:00
Jan 26 09:32:15 compute-1 dnf[34493]: Extra Packages for Enterprise Linux 9 - x86_64  247 kB/s |  31 kB     00:00
Jan 26 09:32:16 compute-1 sudo[35710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkwcmmgjpsiqmexzdrxpnylnebqwzxyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419935.691481-473-157692211964236/AnsiballZ_selinux.py'
Jan 26 09:32:16 compute-1 sudo[35710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:16 compute-1 dnf[34493]: Metadata cache created.
Jan 26 09:32:16 compute-1 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 26 09:32:16 compute-1 systemd[1]: Finished dnf makecache.
Jan 26 09:32:16 compute-1 systemd[1]: dnf-makecache.service: Consumed 1.942s CPU time.
Jan 26 09:32:16 compute-1 python3.9[35712]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 26 09:32:16 compute-1 sudo[35710]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:17 compute-1 sudo[35863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzzvprnrzdwtvdxbxxagcsziweuxisbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419937.0471795-506-174323436471339/AnsiballZ_command.py'
Jan 26 09:32:17 compute-1 sudo[35863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:17 compute-1 python3.9[35865]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 26 09:32:19 compute-1 sudo[35863]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:19 compute-1 sudo[36016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrbarxrllgjtzithknmmrhigvjctdaxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419939.6154826-530-264758789048644/AnsiballZ_file.py'
Jan 26 09:32:19 compute-1 sudo[36016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:20 compute-1 python3.9[36018]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:32:20 compute-1 sudo[36016]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:22 compute-1 sudo[36168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgwwjhpfemyvdshozpqgygnlzrpfzloi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419942.5076888-554-52756046055808/AnsiballZ_mount.py'
Jan 26 09:32:22 compute-1 sudo[36168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:23 compute-1 python3.9[36170]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 26 09:32:23 compute-1 sudo[36168]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:24 compute-1 sudo[36320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyqdvxkqomaserkoxmlartbzjpzbuutg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419944.2079458-638-128565241718781/AnsiballZ_file.py'
Jan 26 09:32:24 compute-1 sudo[36320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:24 compute-1 python3.9[36322]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:32:24 compute-1 sudo[36320]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:26 compute-1 sshd-session[36347]: Connection closed by authenticating user root 152.42.136.110 port 60988 [preauth]
Jan 26 09:32:27 compute-1 sudo[36474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rprmmxgnxfgvfqosjyipwuqopuldrcrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419947.4861498-662-124391614481009/AnsiballZ_stat.py'
Jan 26 09:32:27 compute-1 sudo[36474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:27 compute-1 python3.9[36476]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:32:27 compute-1 sudo[36474]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:28 compute-1 sudo[36597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mheexrncbqykcqodwylcyorywksfmfbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419947.4861498-662-124391614481009/AnsiballZ_copy.py'
Jan 26 09:32:28 compute-1 sudo[36597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:31 compute-1 python3.9[36599]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769419947.4861498-662-124391614481009/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a4f71bf0609e75a0e091c9100076ae4c4a7bed4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:32:31 compute-1 sudo[36597]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:31 compute-1 sshd-session[36600]: Connection closed by authenticating user root 104.248.198.71 port 38036 [preauth]
Jan 26 09:32:32 compute-1 sudo[36751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohukbhsmzhmidldvhnvyslemiibkrwvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419951.8719292-734-251722759477748/AnsiballZ_stat.py'
Jan 26 09:32:32 compute-1 sudo[36751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:32 compute-1 python3.9[36753]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:32:32 compute-1 sudo[36751]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:32 compute-1 sudo[36903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msznlcimenktgowubfnrfvxrktgbcbsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419952.5772028-758-25933267431608/AnsiballZ_command.py'
Jan 26 09:32:32 compute-1 sudo[36903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:34 compute-1 python3.9[36905]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:32:34 compute-1 sudo[36903]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:34 compute-1 sudo[37056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrghmhzbuaiukxenfjkmmaifscxjfqdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419954.4985554-782-126988690237481/AnsiballZ_file.py'
Jan 26 09:32:34 compute-1 sudo[37056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:34 compute-1 python3.9[37058]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:32:34 compute-1 sudo[37056]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:35 compute-1 sudo[37208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iksqicqrtjaziahugxqtxjraqaqobjxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419955.4811988-815-279148734787255/AnsiballZ_getent.py'
Jan 26 09:32:35 compute-1 sudo[37208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:36 compute-1 python3.9[37210]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 26 09:32:36 compute-1 sudo[37208]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:36 compute-1 sudo[37361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuknuugzlwpddiybtndcifolzogubeqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419956.3235226-839-169149346294597/AnsiballZ_group.py'
Jan 26 09:32:36 compute-1 sudo[37361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:36 compute-1 python3.9[37363]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 09:32:37 compute-1 groupadd[37364]: group added to /etc/group: name=qemu, GID=107
Jan 26 09:32:37 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 09:32:37 compute-1 groupadd[37364]: group added to /etc/gshadow: name=qemu
Jan 26 09:32:37 compute-1 groupadd[37364]: new group: name=qemu, GID=107
Jan 26 09:32:37 compute-1 sudo[37361]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:38 compute-1 sudo[37520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huvbkscijgxejslpepvmcrxheiobrkfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419957.5260255-863-251111892022243/AnsiballZ_user.py'
Jan 26 09:32:38 compute-1 sudo[37520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:38 compute-1 python3.9[37522]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 09:32:38 compute-1 useradd[37524]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Jan 26 09:32:38 compute-1 sudo[37520]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:39 compute-1 sudo[37680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtehqlvfaagpbsslxiuywoxgdndznboz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419958.8477519-887-83911187424959/AnsiballZ_getent.py'
Jan 26 09:32:39 compute-1 sudo[37680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:39 compute-1 python3.9[37682]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 26 09:32:39 compute-1 sudo[37680]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:39 compute-1 sudo[37833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ciftiwzfyjlkbkvbkesmnedhaqxfsklk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419959.527067-911-258506227245179/AnsiballZ_group.py'
Jan 26 09:32:39 compute-1 sudo[37833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:40 compute-1 python3.9[37835]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 09:32:40 compute-1 groupadd[37836]: group added to /etc/group: name=hugetlbfs, GID=42477
Jan 26 09:32:40 compute-1 groupadd[37836]: group added to /etc/gshadow: name=hugetlbfs
Jan 26 09:32:40 compute-1 groupadd[37836]: new group: name=hugetlbfs, GID=42477
Jan 26 09:32:40 compute-1 sudo[37833]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:40 compute-1 sudo[37991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jubsqdvtboajqcvcvnglqbdmjneplyxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419960.3981998-938-280171595684436/AnsiballZ_file.py'
Jan 26 09:32:40 compute-1 sudo[37991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:40 compute-1 python3.9[37993]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 26 09:32:40 compute-1 sudo[37991]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:41 compute-1 sudo[38143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnenjrbbwfwupyedfsvjjcbfzebvmqpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419961.386138-971-199634545932592/AnsiballZ_dnf.py'
Jan 26 09:32:41 compute-1 sudo[38143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:41 compute-1 python3.9[38145]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:32:44 compute-1 sudo[38143]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:45 compute-1 sudo[38296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtgfwxxilesidfipaxxwkahrtbdzwmax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419964.9607186-995-6577921144068/AnsiballZ_file.py'
Jan 26 09:32:45 compute-1 sudo[38296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:45 compute-1 python3.9[38298]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:32:45 compute-1 sudo[38296]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:45 compute-1 sudo[38448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjickzumqxmdfvlqmptgtbnujuuqaell ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419965.6798482-1019-60321278574533/AnsiballZ_stat.py'
Jan 26 09:32:45 compute-1 sudo[38448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:46 compute-1 python3.9[38450]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:32:46 compute-1 sudo[38448]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:46 compute-1 sudo[38571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sftgnludmqbowuwrxstlmpxwkijaplad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419965.6798482-1019-60321278574533/AnsiballZ_copy.py'
Jan 26 09:32:46 compute-1 sudo[38571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:46 compute-1 python3.9[38573]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769419965.6798482-1019-60321278574533/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:32:46 compute-1 sudo[38571]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:47 compute-1 sudo[38723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbnmmpazprnohgrpixofuebzxzpoqrls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419966.9070232-1064-71816645567083/AnsiballZ_systemd.py'
Jan 26 09:32:47 compute-1 sudo[38723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:47 compute-1 python3.9[38725]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 09:32:47 compute-1 systemd[1]: Starting Load Kernel Modules...
Jan 26 09:32:47 compute-1 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 26 09:32:47 compute-1 kernel: Bridge firewalling registered
Jan 26 09:32:47 compute-1 systemd-modules-load[38729]: Inserted module 'br_netfilter'
Jan 26 09:32:47 compute-1 systemd[1]: Finished Load Kernel Modules.
Jan 26 09:32:47 compute-1 sudo[38723]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:48 compute-1 sudo[38883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moypjdglfijmpggcgksjumlisxcaktki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419968.1550276-1088-254231695047924/AnsiballZ_stat.py'
Jan 26 09:32:48 compute-1 sudo[38883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:48 compute-1 python3.9[38885]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:32:48 compute-1 sudo[38883]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:48 compute-1 sudo[39006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjfbbomerwyesipxjspohcrcvlxwnsju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419968.1550276-1088-254231695047924/AnsiballZ_copy.py'
Jan 26 09:32:48 compute-1 sudo[39006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:49 compute-1 python3.9[39008]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769419968.1550276-1088-254231695047924/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:32:49 compute-1 sudo[39006]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:49 compute-1 sudo[39158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmtsuygaggyrxjgseqrdrmblamzerdrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419969.62506-1142-264154593640056/AnsiballZ_dnf.py'
Jan 26 09:32:49 compute-1 sudo[39158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:50 compute-1 python3.9[39160]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:32:53 compute-1 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 26 09:32:53 compute-1 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 26 09:32:53 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 09:32:53 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 09:32:53 compute-1 systemd[1]: Reloading.
Jan 26 09:32:53 compute-1 systemd-rc-local-generator[39228]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:32:53 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 09:32:54 compute-1 sshd-session[39179]: Connection closed by authenticating user root 178.62.249.31 port 34566 [preauth]
Jan 26 09:32:54 compute-1 sudo[39158]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:55 compute-1 python3.9[40888]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:32:56 compute-1 python3.9[41800]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 26 09:32:57 compute-1 python3.9[42508]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:32:57 compute-1 sudo[43335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwqbfguyjsrgtyftlwspngsekmbjjghq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419977.4599423-1259-98307837535281/AnsiballZ_command.py'
Jan 26 09:32:57 compute-1 sudo[43335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:57 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 09:32:57 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 09:32:57 compute-1 systemd[1]: man-db-cache-update.service: Consumed 5.302s CPU time.
Jan 26 09:32:57 compute-1 systemd[1]: run-re983052085a14bd191ab4bb571acf307.service: Deactivated successfully.
Jan 26 09:32:57 compute-1 python3.9[43337]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:32:58 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 26 09:32:58 compute-1 systemd[1]: Starting Authorization Manager...
Jan 26 09:32:58 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 26 09:32:58 compute-1 polkitd[43555]: Started polkitd version 0.117
Jan 26 09:32:58 compute-1 polkitd[43555]: Loading rules from directory /etc/polkit-1/rules.d
Jan 26 09:32:58 compute-1 polkitd[43555]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 26 09:32:58 compute-1 polkitd[43555]: Finished loading, compiling and executing 2 rules
Jan 26 09:32:58 compute-1 systemd[1]: Started Authorization Manager.
Jan 26 09:32:58 compute-1 polkitd[43555]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Jan 26 09:32:58 compute-1 sudo[43335]: pam_unix(sudo:session): session closed for user root
Jan 26 09:32:59 compute-1 sudo[43723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koxozgqpaofbsdszsysrzunmwnjtudui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419978.8799195-1286-163052287251038/AnsiballZ_systemd.py'
Jan 26 09:32:59 compute-1 sudo[43723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:32:59 compute-1 python3.9[43725]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:32:59 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 26 09:32:59 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Jan 26 09:32:59 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 26 09:32:59 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 26 09:32:59 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 26 09:32:59 compute-1 sudo[43723]: pam_unix(sudo:session): session closed for user root
Jan 26 09:33:00 compute-1 python3.9[43886]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 26 09:33:03 compute-1 sudo[44036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcjzazjdnnlbbfyrxntouhgajwdtphbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419983.3323405-1457-148155137908534/AnsiballZ_systemd.py'
Jan 26 09:33:03 compute-1 sudo[44036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:33:03 compute-1 python3.9[44038]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:33:04 compute-1 systemd[1]: Reloading.
Jan 26 09:33:04 compute-1 systemd-rc-local-generator[44064]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:33:04 compute-1 sudo[44036]: pam_unix(sudo:session): session closed for user root
Jan 26 09:33:04 compute-1 sudo[44226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdynvmbytmumrbnbxairxwzxuprxkvtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419984.4493046-1457-42271080325384/AnsiballZ_systemd.py'
Jan 26 09:33:04 compute-1 sudo[44226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:33:05 compute-1 python3.9[44228]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:33:05 compute-1 systemd[1]: Reloading.
Jan 26 09:33:05 compute-1 systemd-rc-local-generator[44258]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:33:05 compute-1 sudo[44226]: pam_unix(sudo:session): session closed for user root
Jan 26 09:33:05 compute-1 sudo[44415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbeanxeecjfjhhcqbetghzrqkwlkojup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419985.595187-1505-131479483760335/AnsiballZ_command.py'
Jan 26 09:33:05 compute-1 sudo[44415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:33:06 compute-1 python3.9[44417]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:33:06 compute-1 sudo[44415]: pam_unix(sudo:session): session closed for user root
Jan 26 09:33:06 compute-1 sudo[44568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qooxidekeoeoqautjehioomvzovprxop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419986.3688154-1529-205961851174030/AnsiballZ_command.py'
Jan 26 09:33:06 compute-1 sudo[44568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:33:06 compute-1 python3.9[44570]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:33:06 compute-1 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 26 09:33:06 compute-1 sudo[44568]: pam_unix(sudo:session): session closed for user root
Jan 26 09:33:07 compute-1 sudo[44721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enqwlwsyqmfidggozbjrzjwtfonvwfzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419987.1055808-1553-88549746756564/AnsiballZ_command.py'
Jan 26 09:33:07 compute-1 sudo[44721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:33:07 compute-1 python3.9[44723]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:33:09 compute-1 sudo[44721]: pam_unix(sudo:session): session closed for user root
Jan 26 09:33:09 compute-1 sudo[44883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yngnkutaeltsqmriflfmeoeceamiogrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419989.2994647-1577-70803563900213/AnsiballZ_command.py'
Jan 26 09:33:09 compute-1 sudo[44883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:33:09 compute-1 python3.9[44885]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:33:09 compute-1 sudo[44883]: pam_unix(sudo:session): session closed for user root
Jan 26 09:33:10 compute-1 sudo[45036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anfoieouxiqzlzesbhzmqjijbtihokdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419989.9644706-1601-3609892302987/AnsiballZ_systemd.py'
Jan 26 09:33:10 compute-1 sudo[45036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:33:10 compute-1 python3.9[45038]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 09:33:10 compute-1 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 26 09:33:10 compute-1 systemd[1]: Stopped Apply Kernel Variables.
Jan 26 09:33:10 compute-1 systemd[1]: Stopping Apply Kernel Variables...
Jan 26 09:33:10 compute-1 systemd[1]: Starting Apply Kernel Variables...
Jan 26 09:33:10 compute-1 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 26 09:33:10 compute-1 systemd[1]: Finished Apply Kernel Variables.
Jan 26 09:33:10 compute-1 sudo[45036]: pam_unix(sudo:session): session closed for user root
Jan 26 09:33:11 compute-1 sshd-session[31321]: Connection closed by 192.168.122.30 port 60204
Jan 26 09:33:11 compute-1 sshd-session[31318]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:33:11 compute-1 systemd[1]: session-10.scope: Deactivated successfully.
Jan 26 09:33:11 compute-1 systemd[1]: session-10.scope: Consumed 2min 21.532s CPU time.
Jan 26 09:33:11 compute-1 systemd-logind[783]: Session 10 logged out. Waiting for processes to exit.
Jan 26 09:33:11 compute-1 systemd-logind[783]: Removed session 10.
Jan 26 09:33:13 compute-1 sshd-session[45068]: Connection closed by authenticating user root 152.42.136.110 port 40916 [preauth]
Jan 26 09:33:16 compute-1 sshd-session[45070]: Accepted publickey for zuul from 192.168.122.30 port 46840 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:33:16 compute-1 systemd-logind[783]: New session 11 of user zuul.
Jan 26 09:33:16 compute-1 systemd[1]: Started Session 11 of User zuul.
Jan 26 09:33:16 compute-1 sshd-session[45070]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:33:17 compute-1 python3.9[45223]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:33:18 compute-1 sudo[45377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrbocruuxjdabcvjxucxqamaztzlthas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419998.1852672-64-30261872114465/AnsiballZ_getent.py'
Jan 26 09:33:18 compute-1 sudo[45377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:33:18 compute-1 python3.9[45379]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 26 09:33:18 compute-1 sudo[45377]: pam_unix(sudo:session): session closed for user root
Jan 26 09:33:19 compute-1 sudo[45530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfxpntntpxnlltcehlkrnevjkclnzopm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769419999.1750855-88-10157020691202/AnsiballZ_group.py'
Jan 26 09:33:19 compute-1 sudo[45530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:33:19 compute-1 python3.9[45532]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 09:33:19 compute-1 groupadd[45533]: group added to /etc/group: name=openvswitch, GID=42476
Jan 26 09:33:19 compute-1 groupadd[45533]: group added to /etc/gshadow: name=openvswitch
Jan 26 09:33:19 compute-1 groupadd[45533]: new group: name=openvswitch, GID=42476
Jan 26 09:33:19 compute-1 sudo[45530]: pam_unix(sudo:session): session closed for user root
Jan 26 09:33:20 compute-1 sudo[45688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyubdwnugcgikjcllbhsrtherweehlln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420000.131053-112-264574465976930/AnsiballZ_user.py'
Jan 26 09:33:20 compute-1 sudo[45688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:33:20 compute-1 python3.9[45690]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 09:33:20 compute-1 useradd[45692]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Jan 26 09:33:20 compute-1 useradd[45692]: add 'openvswitch' to group 'hugetlbfs'
Jan 26 09:33:20 compute-1 useradd[45692]: add 'openvswitch' to shadow group 'hugetlbfs'
Jan 26 09:33:20 compute-1 sudo[45688]: pam_unix(sudo:session): session closed for user root
Jan 26 09:33:21 compute-1 sudo[45848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpkumjkiaggtyjkbcudaopocansbjvil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420001.273468-142-153714279833193/AnsiballZ_setup.py'
Jan 26 09:33:21 compute-1 sudo[45848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:33:21 compute-1 python3.9[45850]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 09:33:22 compute-1 sudo[45848]: pam_unix(sudo:session): session closed for user root
Jan 26 09:33:22 compute-1 sshd-session[45851]: Connection closed by authenticating user root 104.248.198.71 port 44958 [preauth]
Jan 26 09:33:22 compute-1 sudo[45934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsjyaukrdkgqoeynzseuonnhwvpivhpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420001.273468-142-153714279833193/AnsiballZ_dnf.py'
Jan 26 09:33:22 compute-1 sudo[45934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:33:22 compute-1 python3.9[45936]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 09:33:24 compute-1 sudo[45934]: pam_unix(sudo:session): session closed for user root
Jan 26 09:33:25 compute-1 sudo[46098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejrrtomwijdcshawaagrbapzfjafvixq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420005.249123-184-135140849443742/AnsiballZ_dnf.py'
Jan 26 09:33:25 compute-1 sudo[46098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:33:25 compute-1 python3.9[46100]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:33:39 compute-1 kernel: SELinux:  Converting 2736 SID table entries...
Jan 26 09:33:39 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 09:33:39 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 26 09:33:39 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 09:33:39 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 26 09:33:39 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 09:33:39 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 09:33:39 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 09:33:39 compute-1 groupadd[46123]: group added to /etc/group: name=unbound, GID=994
Jan 26 09:33:39 compute-1 groupadd[46123]: group added to /etc/gshadow: name=unbound
Jan 26 09:33:39 compute-1 groupadd[46123]: new group: name=unbound, GID=994
Jan 26 09:33:39 compute-1 useradd[46130]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Jan 26 09:33:39 compute-1 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 26 09:33:39 compute-1 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 26 09:33:41 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 09:33:41 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 09:33:41 compute-1 systemd[1]: Reloading.
Jan 26 09:33:41 compute-1 systemd-rc-local-generator[46631]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:33:41 compute-1 systemd-sysv-generator[46635]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:33:41 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 09:33:42 compute-1 sudo[46098]: pam_unix(sudo:session): session closed for user root
Jan 26 09:33:42 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 09:33:42 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 09:33:42 compute-1 systemd[1]: run-r0fceec3173db4869931d59b965793161.service: Deactivated successfully.
Jan 26 09:33:42 compute-1 sshd-session[46548]: Connection closed by authenticating user root 178.62.249.31 port 45584 [preauth]
Jan 26 09:33:43 compute-1 sudo[47197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhwuuuixuvguyfwvvxyesaaxdpegrmyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420022.6006567-208-231169117996909/AnsiballZ_systemd.py'
Jan 26 09:33:43 compute-1 sudo[47197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:33:43 compute-1 python3.9[47199]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 09:33:43 compute-1 systemd[1]: Reloading.
Jan 26 09:33:43 compute-1 systemd-rc-local-generator[47230]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:33:43 compute-1 systemd-sysv-generator[47233]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:33:44 compute-1 systemd[1]: Starting Open vSwitch Database Unit...
Jan 26 09:33:44 compute-1 chown[47241]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 26 09:33:44 compute-1 ovs-ctl[47246]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 26 09:33:44 compute-1 ovs-ctl[47246]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 26 09:33:44 compute-1 ovs-ctl[47246]: Starting ovsdb-server [  OK  ]
Jan 26 09:33:44 compute-1 ovs-vsctl[47295]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 26 09:33:44 compute-1 ovs-vsctl[47315]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"5f259fb6-5896-4c89-8853-1dd537a2ebf7\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 26 09:33:44 compute-1 ovs-ctl[47246]: Configuring Open vSwitch system IDs [  OK  ]
Jan 26 09:33:44 compute-1 ovs-ctl[47246]: Enabling remote OVSDB managers [  OK  ]
Jan 26 09:33:44 compute-1 systemd[1]: Started Open vSwitch Database Unit.
Jan 26 09:33:44 compute-1 ovs-vsctl[47321]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 26 09:33:44 compute-1 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 26 09:33:44 compute-1 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 26 09:33:44 compute-1 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 26 09:33:44 compute-1 kernel: openvswitch: Open vSwitch switching datapath
Jan 26 09:33:44 compute-1 ovs-ctl[47366]: Inserting openvswitch module [  OK  ]
Jan 26 09:33:44 compute-1 ovs-ctl[47334]: Starting ovs-vswitchd [  OK  ]
Jan 26 09:33:44 compute-1 ovs-ctl[47334]: Enabling remote OVSDB managers [  OK  ]
Jan 26 09:33:44 compute-1 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 26 09:33:44 compute-1 ovs-vsctl[47384]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 26 09:33:44 compute-1 systemd[1]: Starting Open vSwitch...
Jan 26 09:33:44 compute-1 systemd[1]: Finished Open vSwitch.
Jan 26 09:33:44 compute-1 sudo[47197]: pam_unix(sudo:session): session closed for user root
Jan 26 09:33:45 compute-1 python3.9[47535]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:33:46 compute-1 sudo[47685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gecztlxhxhmneylbawfeheuglrkyeqok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420025.8344676-262-212026086142383/AnsiballZ_sefcontext.py'
Jan 26 09:33:46 compute-1 sudo[47685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:33:46 compute-1 python3.9[47687]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 26 09:33:47 compute-1 kernel: SELinux:  Converting 2750 SID table entries...
Jan 26 09:33:47 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 09:33:47 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 26 09:33:47 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 09:33:47 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 26 09:33:47 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 09:33:47 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 09:33:47 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 09:33:47 compute-1 sudo[47685]: pam_unix(sudo:session): session closed for user root
Jan 26 09:33:48 compute-1 python3.9[47842]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:33:49 compute-1 sudo[47998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzxivrxfccthmohurmqlzjhtufscdhvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420029.3365595-316-93394204864160/AnsiballZ_dnf.py'
Jan 26 09:33:49 compute-1 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 26 09:33:49 compute-1 sudo[47998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:33:49 compute-1 python3.9[48000]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:33:51 compute-1 sudo[47998]: pam_unix(sudo:session): session closed for user root
Jan 26 09:33:51 compute-1 sudo[48151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocpcbfircofvpjcqfhcgqgaesdwxapor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420031.5286796-340-144836247374842/AnsiballZ_command.py'
Jan 26 09:33:51 compute-1 sudo[48151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:33:52 compute-1 python3.9[48153]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:33:52 compute-1 sudo[48151]: pam_unix(sudo:session): session closed for user root
Jan 26 09:33:53 compute-1 sudo[48438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvttpjgpqoqamuoebknkkcyldnffymkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420033.1057956-364-111735768315877/AnsiballZ_file.py'
Jan 26 09:33:53 compute-1 sudo[48438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:33:53 compute-1 python3.9[48440]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 26 09:33:53 compute-1 sudo[48438]: pam_unix(sudo:session): session closed for user root
Jan 26 09:33:54 compute-1 python3.9[48592]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:33:55 compute-1 sudo[48744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxkmbbkvljhxqptmjhiqxrmajenjozjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420034.933055-412-22731464422054/AnsiballZ_dnf.py'
Jan 26 09:33:55 compute-1 sudo[48744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:33:55 compute-1 python3.9[48746]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:33:56 compute-1 sshd-session[48485]: Connection closed by authenticating user root 152.42.136.110 port 38726 [preauth]
Jan 26 09:33:59 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 09:33:59 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 09:33:59 compute-1 systemd[1]: Reloading.
Jan 26 09:33:59 compute-1 systemd-rc-local-generator[48784]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:33:59 compute-1 systemd-sysv-generator[48788]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:33:59 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 09:34:00 compute-1 sudo[48744]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:00 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 09:34:00 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 09:34:00 compute-1 systemd[1]: run-r803c64bf4b0f4d7e97c0d98a835bf6b5.service: Deactivated successfully.
Jan 26 09:34:01 compute-1 sudo[49062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujtuxgzwkbymdivfvylkrtxuofzolvtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420040.8574092-436-89366831035542/AnsiballZ_systemd.py'
Jan 26 09:34:01 compute-1 sudo[49062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:01 compute-1 python3.9[49064]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 09:34:01 compute-1 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 26 09:34:01 compute-1 systemd[1]: Stopped Network Manager Wait Online.
Jan 26 09:34:01 compute-1 systemd[1]: Stopping Network Manager Wait Online...
Jan 26 09:34:01 compute-1 systemd[1]: Stopping Network Manager...
Jan 26 09:34:01 compute-1 NetworkManager[7211]: <info>  [1769420041.5308] caught SIGTERM, shutting down normally.
Jan 26 09:34:01 compute-1 NetworkManager[7211]: <info>  [1769420041.5325] dhcp4 (eth0): canceled DHCP transaction
Jan 26 09:34:01 compute-1 NetworkManager[7211]: <info>  [1769420041.5325] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 09:34:01 compute-1 NetworkManager[7211]: <info>  [1769420041.5325] dhcp4 (eth0): state changed no lease
Jan 26 09:34:01 compute-1 NetworkManager[7211]: <info>  [1769420041.5328] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 09:34:01 compute-1 NetworkManager[7211]: <info>  [1769420041.5442] exiting (success)
Jan 26 09:34:01 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 09:34:01 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 09:34:01 compute-1 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 26 09:34:01 compute-1 systemd[1]: Stopped Network Manager.
Jan 26 09:34:01 compute-1 systemd[1]: NetworkManager.service: Consumed 11.672s CPU time, 4.1M memory peak, read 0B from disk, written 39.0K to disk.
Jan 26 09:34:01 compute-1 systemd[1]: Starting Network Manager...
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.6070] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:ddaf00d4-1dc5-4d7f-b5ab-626f2ab79e8a)
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.6073] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.6139] manager[0x563bd18df000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 26 09:34:01 compute-1 systemd[1]: Starting Hostname Service...
Jan 26 09:34:01 compute-1 systemd[1]: Started Hostname Service.
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.6894] hostname: hostname: using hostnamed
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.6901] hostname: static hostname changed from (none) to "compute-1"
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.6908] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.6913] manager[0x563bd18df000]: rfkill: Wi-Fi hardware radio set enabled
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.6913] manager[0x563bd18df000]: rfkill: WWAN hardware radio set enabled
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.6939] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.6950] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.6951] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.6952] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.6952] manager: Networking is enabled by state file
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.6954] settings: Loaded settings plugin: keyfile (internal)
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.6959] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.6990] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7004] dhcp: init: Using DHCP client 'internal'
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7007] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7014] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7021] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7035] device (lo): Activation: starting connection 'lo' (02fbee44-d9d6-4838-9f7e-bcbf35b7d384)
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7044] device (eth0): carrier: link connected
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7048] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7053] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7054] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7060] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7067] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7072] device (eth1): carrier: link connected
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7076] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7081] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (005e0c67-cd91-5f06-b6a9-1e083d81271f) (indicated)
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7081] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7086] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7092] device (eth1): Activation: starting connection 'ci-private-network' (005e0c67-cd91-5f06-b6a9-1e083d81271f)
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7098] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 26 09:34:01 compute-1 systemd[1]: Started Network Manager.
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7107] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7109] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7111] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7113] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7116] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7119] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7121] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7124] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7130] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7134] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7151] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7172] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7184] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7189] dhcp4 (eth0): state changed new lease, address=38.102.83.217
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7192] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7198] device (lo): Activation: successful, device activated.
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7212] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 26 09:34:01 compute-1 systemd[1]: Starting Network Manager Wait Online...
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7306] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7314] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7316] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7320] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7322] device (eth1): Activation: successful, device activated.
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7347] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7348] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7354] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7358] device (eth0): Activation: successful, device activated.
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7364] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 26 09:34:01 compute-1 NetworkManager[49073]: <info>  [1769420041.7368] manager: startup complete
Jan 26 09:34:01 compute-1 systemd[1]: Finished Network Manager Wait Online.
Jan 26 09:34:01 compute-1 sudo[49062]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:02 compute-1 sudo[49288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwcafcyobnfqiwnaevzvucbmcpjcwysa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420041.9396315-460-270458773812564/AnsiballZ_dnf.py'
Jan 26 09:34:02 compute-1 sudo[49288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:02 compute-1 python3.9[49290]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:34:09 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 09:34:09 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 09:34:09 compute-1 systemd[1]: Reloading.
Jan 26 09:34:09 compute-1 systemd-sysv-generator[49345]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:34:09 compute-1 systemd-rc-local-generator[49339]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:34:09 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 09:34:10 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 09:34:10 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 09:34:10 compute-1 systemd[1]: run-r2118850b088a4b949e0110ec8d64533f.service: Deactivated successfully.
Jan 26 09:34:10 compute-1 sudo[49288]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:11 compute-1 sudo[49748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-teavbdilmczajdqnzqisilfnwzeboclt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420051.4834476-496-175940063808356/AnsiballZ_stat.py'
Jan 26 09:34:11 compute-1 sudo[49748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:11 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 09:34:12 compute-1 python3.9[49750]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:34:12 compute-1 sudo[49748]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:12 compute-1 sudo[49900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxldivijvdvipzcmqelaxlkailrdrdqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420052.2955284-523-83130795498461/AnsiballZ_ini_file.py'
Jan 26 09:34:12 compute-1 sudo[49900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:12 compute-1 python3.9[49902]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:34:12 compute-1 sudo[49900]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:13 compute-1 sudo[50054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmjemniublfslmiyaligvaretuwvvjjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420053.2741559-553-261828729149894/AnsiballZ_ini_file.py'
Jan 26 09:34:13 compute-1 sudo[50054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:13 compute-1 python3.9[50056]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:34:13 compute-1 sudo[50054]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:14 compute-1 sudo[50206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjbncbgqpkxzqwvnsendirkcqjtouklc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420053.8757355-553-233181932627102/AnsiballZ_ini_file.py'
Jan 26 09:34:14 compute-1 sudo[50206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:14 compute-1 python3.9[50208]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:34:14 compute-1 sudo[50206]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:14 compute-1 sudo[50358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcgwdwzkknumxvmrgsatzfmojhkxidrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420054.5660536-598-122429126402578/AnsiballZ_ini_file.py'
Jan 26 09:34:14 compute-1 sudo[50358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:15 compute-1 python3.9[50360]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:34:15 compute-1 sudo[50358]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:15 compute-1 sudo[50510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fenrixubvjzzerhpuamxkihejhepesja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420055.4179747-598-17054205086179/AnsiballZ_ini_file.py'
Jan 26 09:34:15 compute-1 sudo[50510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:15 compute-1 python3.9[50512]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:34:15 compute-1 sudo[50510]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:16 compute-1 sudo[50662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkdlnbhajkgdqskjpehnhkfozvigfazu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420056.1948519-643-205495253638858/AnsiballZ_stat.py'
Jan 26 09:34:16 compute-1 sudo[50662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:16 compute-1 python3.9[50664]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:34:16 compute-1 sudo[50662]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:17 compute-1 sudo[50787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsluuvaazvliknfirvcyjgnkxsqpxdqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420056.1948519-643-205495253638858/AnsiballZ_copy.py'
Jan 26 09:34:17 compute-1 sudo[50787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:17 compute-1 python3.9[50789]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420056.1948519-643-205495253638858/.source _original_basename=.tievaic0 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:34:17 compute-1 sudo[50787]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:17 compute-1 sshd-session[50712]: Connection closed by authenticating user root 104.248.198.71 port 42308 [preauth]
Jan 26 09:34:17 compute-1 sudo[50939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivfxhgztsucfvvychfxkoiighqvrpwqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420057.5695462-688-185540623010971/AnsiballZ_file.py'
Jan 26 09:34:17 compute-1 sudo[50939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:18 compute-1 python3.9[50941]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:34:18 compute-1 sudo[50939]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:18 compute-1 sudo[51091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhonxkwbuzplazroucezidfhmwysicln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420058.2193363-712-50349518296558/AnsiballZ_edpm_os_net_config_mappings.py'
Jan 26 09:34:18 compute-1 sudo[51091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:18 compute-1 python3.9[51093]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 26 09:34:18 compute-1 sudo[51091]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:19 compute-1 sudo[51243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipmayluwwoyahxgrynwablqykbehlbtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420059.0833228-739-21849786252176/AnsiballZ_file.py'
Jan 26 09:34:19 compute-1 sudo[51243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:19 compute-1 python3.9[51245]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:34:19 compute-1 sudo[51243]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:20 compute-1 sudo[51395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdfyecegdezshgkmangpfciyomzxwzob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420060.060739-769-280316725320697/AnsiballZ_stat.py'
Jan 26 09:34:20 compute-1 sudo[51395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:20 compute-1 sudo[51395]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:20 compute-1 sudo[51518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzppipchjejtlxiklhxirhaqkyzgkisz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420060.060739-769-280316725320697/AnsiballZ_copy.py'
Jan 26 09:34:20 compute-1 sudo[51518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:21 compute-1 sudo[51518]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:21 compute-1 sudo[51670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nidvhwapfualsnwjhovodotqxpqufbja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420061.3661592-814-17580824232049/AnsiballZ_slurp.py'
Jan 26 09:34:21 compute-1 sudo[51670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:22 compute-1 python3.9[51672]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 26 09:34:22 compute-1 sudo[51670]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:23 compute-1 sudo[51845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqaecykivthbxwtsokcqxvpcdarzhzxx ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420062.332005-841-155806645680929/async_wrapper.py j856592117221 300 /home/zuul/.ansible/tmp/ansible-tmp-1769420062.332005-841-155806645680929/AnsiballZ_edpm_os_net_config.py _'
Jan 26 09:34:23 compute-1 sudo[51845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:23 compute-1 ansible-async_wrapper.py[51847]: Invoked with j856592117221 300 /home/zuul/.ansible/tmp/ansible-tmp-1769420062.332005-841-155806645680929/AnsiballZ_edpm_os_net_config.py _
Jan 26 09:34:23 compute-1 ansible-async_wrapper.py[51850]: Starting module and watcher
Jan 26 09:34:23 compute-1 ansible-async_wrapper.py[51850]: Start watching 51851 (300)
Jan 26 09:34:23 compute-1 ansible-async_wrapper.py[51851]: Start module (51851)
Jan 26 09:34:23 compute-1 ansible-async_wrapper.py[51847]: Return async_wrapper task started.
Jan 26 09:34:23 compute-1 sudo[51845]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:23 compute-1 python3.9[51852]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 26 09:34:24 compute-1 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 26 09:34:24 compute-1 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 26 09:34:24 compute-1 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 26 09:34:24 compute-1 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 26 09:34:24 compute-1 kernel: cfg80211: failed to load regulatory.db
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.2481] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.2504] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3057] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3060] audit: op="connection-add" uuid="3cc5a451-bfd9-4013-aa49-a471bbbe10a9" name="br-ex-br" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3079] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3081] audit: op="connection-add" uuid="9153e954-4bca-40ba-b358-459a063898be" name="br-ex-port" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3098] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3100] audit: op="connection-add" uuid="d42f5f9a-7cf2-4296-9705-7315b5bff1d8" name="eth1-port" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3116] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3118] audit: op="connection-add" uuid="1fe6ef39-ff44-4c4a-b7a1-fca2c446f007" name="vlan20-port" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3131] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3134] audit: op="connection-add" uuid="939b3879-66d8-4592-a961-3770dc5e15ea" name="vlan21-port" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3148] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3150] audit: op="connection-add" uuid="b6a2cca4-fb3d-4b00-8296-0f5997fdc3ce" name="vlan22-port" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3162] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3164] audit: op="connection-add" uuid="5d1bc407-4dcf-4a9f-9d44-794d818d114b" name="vlan23-port" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3184] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,connection.autoconnect-priority,connection.timestamp" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3199] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3201] audit: op="connection-add" uuid="083c5da3-5b2f-4a68-b3d7-35d321110d3b" name="br-ex-if" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3340] audit: op="connection-update" uuid="005e0c67-cd91-5f06-b6a9-1e083d81271f" name="ci-private-network" args="ipv4.dns,ipv4.routing-rules,ipv4.routes,ipv4.method,ipv4.addresses,ipv4.never-default,ovs-interface.type,ipv6.dns,ipv6.routing-rules,ipv6.routes,ipv6.addr-gen-mode,ipv6.method,ipv6.addresses,connection.port-type,connection.controller,connection.slave-type,connection.timestamp,connection.master,ovs-external-ids.data" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3357] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3359] audit: op="connection-add" uuid="400efc6f-8710-4783-a66c-228c37e355f6" name="vlan20-if" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3374] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3376] audit: op="connection-add" uuid="ef305799-d499-43fd-b8a7-1fa9c3f71516" name="vlan21-if" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3397] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3399] audit: op="connection-add" uuid="13f21556-e6b4-46b6-9b87-1f2636ded6fb" name="vlan22-if" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3415] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3417] audit: op="connection-add" uuid="3926ba24-a6cd-49fb-8bda-4960a92c7804" name="vlan23-if" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3429] audit: op="connection-delete" uuid="01428c05-ae5c-3e90-b32b-9a07a8ac9b4b" name="Wired connection 1" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3442] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <warn>  [1769420065.3445] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3451] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3455] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (3cc5a451-bfd9-4013-aa49-a471bbbe10a9)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3457] audit: op="connection-activate" uuid="3cc5a451-bfd9-4013-aa49-a471bbbe10a9" name="br-ex-br" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3459] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <warn>  [1769420065.3460] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3465] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3468] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (9153e954-4bca-40ba-b358-459a063898be)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3470] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <warn>  [1769420065.3472] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3476] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3479] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (d42f5f9a-7cf2-4296-9705-7315b5bff1d8)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3481] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <warn>  [1769420065.3483] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3487] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3491] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (1fe6ef39-ff44-4c4a-b7a1-fca2c446f007)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3493] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <warn>  [1769420065.3494] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3499] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3503] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (939b3879-66d8-4592-a961-3770dc5e15ea)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3505] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <warn>  [1769420065.3506] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3511] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3514] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (b6a2cca4-fb3d-4b00-8296-0f5997fdc3ce)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3516] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <warn>  [1769420065.3518] device (vlan23)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3522] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3526] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (5d1bc407-4dcf-4a9f-9d44-794d818d114b)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3527] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3530] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3532] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3538] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <warn>  [1769420065.3540] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3544] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3548] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (083c5da3-5b2f-4a68-b3d7-35d321110d3b)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3550] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3554] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3556] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3558] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3559] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3569] device (eth1): disconnecting for new activation request.
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3571] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3574] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3577] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3581] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3585] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <warn>  [1769420065.3587] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3590] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3595] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (400efc6f-8710-4783-a66c-228c37e355f6)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3596] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3600] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3601] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3603] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3606] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <warn>  [1769420065.3608] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3611] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3617] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (ef305799-d499-43fd-b8a7-1fa9c3f71516)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3618] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3621] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3623] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3625] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3629] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <warn>  [1769420065.3630] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3634] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3639] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (13f21556-e6b4-46b6-9b87-1f2636ded6fb)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3640] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3644] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3646] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3648] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3651] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <warn>  [1769420065.3653] device (vlan23)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3656] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3661] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (3926ba24-a6cd-49fb-8bda-4960a92c7804)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3662] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3665] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3667] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3669] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3671] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3684] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,connection.autoconnect-priority" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3686] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3690] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3692] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3699] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3719] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 kernel: ovs-system: entered promiscuous mode
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3724] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3729] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3731] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3736] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3740] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3744] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 kernel: Timeout policy base is empty
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3746] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3751] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3755] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3759] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3761] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3766] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 systemd-udevd[51857]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3770] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3775] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3777] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3781] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3785] dhcp4 (eth0): canceled DHCP transaction
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3786] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3786] dhcp4 (eth0): state changed no lease
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3788] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3798] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3805] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51853 uid=0 result="fail" reason="Device is not activated"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3811] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3818] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3825] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3832] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Jan 26 09:34:25 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3834] dhcp4 (eth0): state changed new lease, address=38.102.83.217
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.3888] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4094] device (eth1): Activation: starting connection 'ci-private-network' (005e0c67-cd91-5f06-b6a9-1e083d81271f)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4099] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4113] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4116] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4121] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4136] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4143] device (eth1): state change: config -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4145] device (eth1): released from controller device eth1
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4152] device (eth1): disconnecting for new activation request.
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4153] audit: op="connection-activate" uuid="005e0c67-cd91-5f06-b6a9-1e083d81271f" name="ci-private-network" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4153] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4154] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4155] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4156] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4158] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4159] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4161] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4167] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4170] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4173] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4177] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4185] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 09:34:25 compute-1 kernel: br-ex: entered promiscuous mode
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4188] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4191] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4195] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4200] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4205] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4209] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4232] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4242] device (eth1): Activation: starting connection 'ci-private-network' (005e0c67-cd91-5f06-b6a9-1e083d81271f)
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4245] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51853 uid=0 result="success"
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4249] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4252] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4257] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4269] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4271] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 kernel: vlan22: entered promiscuous mode
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4450] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4458] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4470] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 kernel: vlan20: entered promiscuous mode
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4476] device (eth1): Activation: successful, device activated.
Jan 26 09:34:25 compute-1 systemd-udevd[51859]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4501] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4532] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4537] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4543] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4624] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4628] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 26 09:34:25 compute-1 kernel: vlan21: entered promiscuous mode
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4661] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4669] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4700] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4702] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4706] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4712] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4719] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4723] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 09:34:25 compute-1 kernel: vlan23: entered promiscuous mode
Jan 26 09:34:25 compute-1 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4810] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4823] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4844] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.4860] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.5034] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.5038] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.5043] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.5051] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.5053] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 09:34:25 compute-1 NetworkManager[49073]: <info>  [1769420065.5059] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 09:34:26 compute-1 NetworkManager[49073]: <info>  [1769420066.6771] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51853 uid=0 result="success"
Jan 26 09:34:26 compute-1 sudo[52213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjzhdyoldtnpatjkwlrvaizdkcruxdzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420066.3560095-841-76976530405435/AnsiballZ_async_status.py'
Jan 26 09:34:26 compute-1 sudo[52213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:26 compute-1 NetworkManager[49073]: <info>  [1769420066.8431] checkpoint[0x563bd18b5950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 26 09:34:26 compute-1 NetworkManager[49073]: <info>  [1769420066.8435] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51853 uid=0 result="success"
Jan 26 09:34:26 compute-1 python3.9[52215]: ansible-ansible.legacy.async_status Invoked with jid=j856592117221.51847 mode=status _async_dir=/root/.ansible_async
Jan 26 09:34:26 compute-1 sudo[52213]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:27 compute-1 NetworkManager[49073]: <info>  [1769420067.1716] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51853 uid=0 result="success"
Jan 26 09:34:27 compute-1 NetworkManager[49073]: <info>  [1769420067.1730] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51853 uid=0 result="success"
Jan 26 09:34:27 compute-1 NetworkManager[49073]: <info>  [1769420067.4237] audit: op="networking-control" arg="global-dns-configuration" pid=51853 uid=0 result="success"
Jan 26 09:34:27 compute-1 NetworkManager[49073]: <info>  [1769420067.4271] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 26 09:34:27 compute-1 NetworkManager[49073]: <info>  [1769420067.4309] audit: op="networking-control" arg="global-dns-configuration" pid=51853 uid=0 result="success"
Jan 26 09:34:27 compute-1 NetworkManager[49073]: <info>  [1769420067.4329] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51853 uid=0 result="success"
Jan 26 09:34:27 compute-1 NetworkManager[49073]: <info>  [1769420067.5719] checkpoint[0x563bd18b5a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 26 09:34:27 compute-1 NetworkManager[49073]: <info>  [1769420067.5723] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51853 uid=0 result="success"
Jan 26 09:34:27 compute-1 ansible-async_wrapper.py[51851]: Module complete (51851)
Jan 26 09:34:28 compute-1 ansible-async_wrapper.py[51850]: Done in kid B.
Jan 26 09:34:30 compute-1 sudo[52319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfpicuyemkbbptnetypywgelespvomde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420066.3560095-841-76976530405435/AnsiballZ_async_status.py'
Jan 26 09:34:30 compute-1 sudo[52319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:30 compute-1 python3.9[52321]: ansible-ansible.legacy.async_status Invoked with jid=j856592117221.51847 mode=status _async_dir=/root/.ansible_async
Jan 26 09:34:30 compute-1 sudo[52319]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:30 compute-1 sudo[52420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsysdfelsxcyzgqoboewkmkbvyokuiad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420066.3560095-841-76976530405435/AnsiballZ_async_status.py'
Jan 26 09:34:30 compute-1 sudo[52420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:30 compute-1 python3.9[52423]: ansible-ansible.legacy.async_status Invoked with jid=j856592117221.51847 mode=cleanup _async_dir=/root/.ansible_async
Jan 26 09:34:30 compute-1 sudo[52420]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:31 compute-1 sudo[52573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-soziqrkllcieyyqdvvxzbkqdswydpuev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420071.1945255-922-195691267603125/AnsiballZ_stat.py'
Jan 26 09:34:31 compute-1 sudo[52573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:31 compute-1 python3.9[52575]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:34:31 compute-1 sudo[52573]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:31 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 09:34:31 compute-1 sudo[52698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxbaneontyytvixmnymzmfelwjubwffs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420071.1945255-922-195691267603125/AnsiballZ_copy.py'
Jan 26 09:34:31 compute-1 sudo[52698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:31 compute-1 sshd-session[52346]: Connection closed by authenticating user root 178.62.249.31 port 49422 [preauth]
Jan 26 09:34:32 compute-1 python3.9[52700]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420071.1945255-922-195691267603125/.source.returncode _original_basename=.ozvuh3cj follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:34:32 compute-1 sudo[52698]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:32 compute-1 sudo[52850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilkxswopwgmbnwkbaeqbbyyonqtahsxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420072.4897892-970-35151221593775/AnsiballZ_stat.py'
Jan 26 09:34:32 compute-1 sudo[52850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:32 compute-1 python3.9[52852]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:34:32 compute-1 sudo[52850]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:33 compute-1 sudo[52973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggperwmmbjqocglbepinpznsrrrfkbis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420072.4897892-970-35151221593775/AnsiballZ_copy.py'
Jan 26 09:34:33 compute-1 sudo[52973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:33 compute-1 python3.9[52975]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420072.4897892-970-35151221593775/.source.cfg _original_basename=._0xi2g7z follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:34:33 compute-1 sudo[52973]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:33 compute-1 sudo[53126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kilhfqarpqjtuvnyauqnrzjybflxfzrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420073.7295883-1015-23895676507193/AnsiballZ_systemd.py'
Jan 26 09:34:34 compute-1 sudo[53126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:34 compute-1 python3.9[53128]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 09:34:34 compute-1 systemd[1]: Reloading Network Manager...
Jan 26 09:34:34 compute-1 NetworkManager[49073]: <info>  [1769420074.4272] audit: op="reload" arg="0" pid=53132 uid=0 result="success"
Jan 26 09:34:34 compute-1 NetworkManager[49073]: <info>  [1769420074.4280] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 26 09:34:34 compute-1 systemd[1]: Reloaded Network Manager.
Jan 26 09:34:34 compute-1 sudo[53126]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:34 compute-1 sshd-session[45073]: Connection closed by 192.168.122.30 port 46840
Jan 26 09:34:34 compute-1 sshd-session[45070]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:34:34 compute-1 systemd[1]: session-11.scope: Deactivated successfully.
Jan 26 09:34:34 compute-1 systemd[1]: session-11.scope: Consumed 53.693s CPU time.
Jan 26 09:34:34 compute-1 systemd-logind[783]: Session 11 logged out. Waiting for processes to exit.
Jan 26 09:34:34 compute-1 systemd-logind[783]: Removed session 11.
Jan 26 09:34:39 compute-1 sshd-session[53162]: Connection closed by authenticating user root 152.42.136.110 port 60480 [preauth]
Jan 26 09:34:40 compute-1 sshd-session[53164]: Accepted publickey for zuul from 192.168.122.30 port 51658 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:34:40 compute-1 systemd-logind[783]: New session 12 of user zuul.
Jan 26 09:34:40 compute-1 systemd[1]: Started Session 12 of User zuul.
Jan 26 09:34:40 compute-1 sshd-session[53164]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:34:41 compute-1 python3.9[53318]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:34:42 compute-1 python3.9[53472]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 09:34:44 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 09:34:44 compute-1 python3.9[53667]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:34:45 compute-1 sshd-session[53167]: Connection closed by 192.168.122.30 port 51658
Jan 26 09:34:45 compute-1 sshd-session[53164]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:34:45 compute-1 systemd[1]: session-12.scope: Deactivated successfully.
Jan 26 09:34:45 compute-1 systemd[1]: session-12.scope: Consumed 2.240s CPU time.
Jan 26 09:34:45 compute-1 systemd-logind[783]: Session 12 logged out. Waiting for processes to exit.
Jan 26 09:34:45 compute-1 systemd-logind[783]: Removed session 12.
Jan 26 09:34:50 compute-1 sshd-session[53695]: Accepted publickey for zuul from 192.168.122.30 port 59312 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:34:50 compute-1 systemd-logind[783]: New session 13 of user zuul.
Jan 26 09:34:50 compute-1 systemd[1]: Started Session 13 of User zuul.
Jan 26 09:34:50 compute-1 sshd-session[53695]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:34:51 compute-1 python3.9[53848]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:34:52 compute-1 python3.9[54002]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:34:52 compute-1 sudo[54157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvcbhqucijrxjnmkivyouookdyunxgfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420092.5898883-76-65931924878134/AnsiballZ_setup.py'
Jan 26 09:34:52 compute-1 sudo[54157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:53 compute-1 python3.9[54159]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 09:34:53 compute-1 sudo[54157]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:53 compute-1 sudo[54241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otdwmeismfznqcqodzgricdxuvzsexmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420092.5898883-76-65931924878134/AnsiballZ_dnf.py'
Jan 26 09:34:53 compute-1 sudo[54241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:53 compute-1 python3.9[54243]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:34:55 compute-1 sudo[54241]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:55 compute-1 sudo[54394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaacaxjvvydrhsotklbkmfpparxvfywa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420095.566776-112-20941798493481/AnsiballZ_setup.py'
Jan 26 09:34:55 compute-1 sudo[54394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:56 compute-1 python3.9[54396]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 09:34:56 compute-1 sudo[54394]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:57 compute-1 sudo[54590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmbdflknwmxonplqdsnspnwjikqeirzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420096.7632158-145-123974729555611/AnsiballZ_file.py'
Jan 26 09:34:57 compute-1 sudo[54590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:57 compute-1 python3.9[54592]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:34:57 compute-1 sudo[54590]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:57 compute-1 sudo[54742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfrwjcounlzubrwwjwavhtvwluvwittd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420097.5643868-169-47109739342806/AnsiballZ_command.py'
Jan 26 09:34:57 compute-1 sudo[54742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:58 compute-1 python3.9[54744]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:34:58 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat2610399325-merged.mount: Deactivated successfully.
Jan 26 09:34:58 compute-1 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck1689028226-merged.mount: Deactivated successfully.
Jan 26 09:34:58 compute-1 podman[54745]: 2026-01-26 09:34:58.293059283 +0000 UTC m=+0.066136182 system refresh
Jan 26 09:34:58 compute-1 sudo[54742]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:58 compute-1 sudo[54904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvgoxagtbasnzkzcqqctmekicsvcfdrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420098.4922214-193-194955250657632/AnsiballZ_stat.py'
Jan 26 09:34:58 compute-1 sudo[54904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:59 compute-1 python3.9[54906]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:34:59 compute-1 sudo[54904]: pam_unix(sudo:session): session closed for user root
Jan 26 09:34:59 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 09:34:59 compute-1 sudo[55027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjrnevufgtdirlivcitgikpfylupgmmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420098.4922214-193-194955250657632/AnsiballZ_copy.py'
Jan 26 09:34:59 compute-1 sudo[55027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:34:59 compute-1 python3.9[55029]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420098.4922214-193-194955250657632/.source.json follow=False _original_basename=podman_network_config.j2 checksum=4900a580bdba2a04d3b8e5150265ba925035f2ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:34:59 compute-1 sudo[55027]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:00 compute-1 sudo[55179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtvwtfoevpkoucrtkvbpzzxtzkqyqdtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420099.9602005-238-190889517855/AnsiballZ_stat.py'
Jan 26 09:35:00 compute-1 sudo[55179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:00 compute-1 python3.9[55181]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:35:00 compute-1 sudo[55179]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:00 compute-1 sudo[55302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kljevsseonkcluscydzvwxrukunytbtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420099.9602005-238-190889517855/AnsiballZ_copy.py'
Jan 26 09:35:00 compute-1 sudo[55302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:00 compute-1 python3.9[55304]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769420099.9602005-238-190889517855/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:35:00 compute-1 sudo[55302]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:01 compute-1 sudo[55454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjsfvbazdwbobvazgogfirbpcbpbeaya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420101.1894028-286-256814938011138/AnsiballZ_ini_file.py'
Jan 26 09:35:01 compute-1 sudo[55454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:01 compute-1 python3.9[55456]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:35:01 compute-1 sudo[55454]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:02 compute-1 sudo[55606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwggqvfndyudiculoiqrhiiwnnxaqmjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420101.9212577-286-46891071298524/AnsiballZ_ini_file.py'
Jan 26 09:35:02 compute-1 sudo[55606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:02 compute-1 python3.9[55608]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:35:02 compute-1 sudo[55606]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:02 compute-1 sudo[55758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkjhypqyszqqcbejujpdrhjntvavvhob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420102.5165648-286-100181253703040/AnsiballZ_ini_file.py'
Jan 26 09:35:02 compute-1 sudo[55758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:02 compute-1 python3.9[55760]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:35:02 compute-1 sudo[55758]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:03 compute-1 sudo[55910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbjspicmxllrhvdeopbsknnmekywiyux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420103.119085-286-74413596804170/AnsiballZ_ini_file.py'
Jan 26 09:35:03 compute-1 sudo[55910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:03 compute-1 python3.9[55912]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:35:03 compute-1 sudo[55910]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:04 compute-1 sudo[56062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saprkaomvsknnqveelkxcvofexjexzyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420103.889451-379-70598420125262/AnsiballZ_dnf.py'
Jan 26 09:35:04 compute-1 sudo[56062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:04 compute-1 python3.9[56064]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:35:05 compute-1 sudo[56062]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:06 compute-1 sudo[56215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccxpmwakxisbvawptspnryrbmyavcmif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420106.3586843-412-189355422868839/AnsiballZ_setup.py'
Jan 26 09:35:06 compute-1 sudo[56215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:06 compute-1 python3.9[56217]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:35:06 compute-1 sudo[56215]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:07 compute-1 sudo[56369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nksibnxeserrsegcfsakrlbjoqhhuhja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420107.1573396-436-60173492106003/AnsiballZ_stat.py'
Jan 26 09:35:07 compute-1 sudo[56369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:07 compute-1 python3.9[56371]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:35:07 compute-1 sudo[56369]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:08 compute-1 sudo[56521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjgbcohzpodbxzaevsqlhfwwqqsdshja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420107.8908877-463-268706732377212/AnsiballZ_stat.py'
Jan 26 09:35:08 compute-1 sudo[56521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:08 compute-1 python3.9[56523]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:35:08 compute-1 sudo[56521]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:08 compute-1 sudo[56673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czfsznlqqnwbwpnylslliivkjbkypbps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420108.6532874-493-156201969361949/AnsiballZ_command.py'
Jan 26 09:35:08 compute-1 sudo[56673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:09 compute-1 python3.9[56675]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:35:09 compute-1 sudo[56673]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:09 compute-1 sudo[56826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbqxzgeoljwqbuttlspcjvvdwsxuqkqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420109.4349725-523-271249472659321/AnsiballZ_service_facts.py'
Jan 26 09:35:09 compute-1 sudo[56826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:10 compute-1 python3.9[56828]: ansible-service_facts Invoked
Jan 26 09:35:10 compute-1 network[56845]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 09:35:10 compute-1 network[56846]: 'network-scripts' will be removed from distribution in near future.
Jan 26 09:35:10 compute-1 network[56847]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 09:35:12 compute-1 sshd-session[56882]: Connection closed by authenticating user root 104.248.198.71 port 55572 [preauth]
Jan 26 09:35:13 compute-1 sudo[56826]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:15 compute-1 sudo[57132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnzleuwazdqppvqnvqigjsvojhmmpmps ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1769420114.6852112-568-29961806740987/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1769420114.6852112-568-29961806740987/args'
Jan 26 09:35:15 compute-1 sudo[57132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:15 compute-1 sudo[57132]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:15 compute-1 sudo[57299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aezjzmdpkimxmizvxqcliayihphuikzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420115.418732-601-205110095376772/AnsiballZ_dnf.py'
Jan 26 09:35:15 compute-1 sudo[57299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:15 compute-1 python3.9[57301]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:35:17 compute-1 sudo[57299]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:18 compute-1 sudo[57452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkosnetmzededkuhtjywoiuiilplywgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420118.1052656-640-124785569307464/AnsiballZ_package_facts.py'
Jan 26 09:35:18 compute-1 sudo[57452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:19 compute-1 python3.9[57454]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 26 09:35:19 compute-1 sudo[57452]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:20 compute-1 sudo[57606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffktzgheviewjhbzpohcojrueexotnrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420120.0401855-671-249412341059297/AnsiballZ_stat.py'
Jan 26 09:35:20 compute-1 sudo[57606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:20 compute-1 python3.9[57608]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:35:20 compute-1 sudo[57606]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:20 compute-1 sudo[57732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxvremnhwpacbhscxqkvzfsbqmntzgdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420120.0401855-671-249412341059297/AnsiballZ_copy.py'
Jan 26 09:35:20 compute-1 sudo[57732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:21 compute-1 python3.9[57734]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420120.0401855-671-249412341059297/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:35:21 compute-1 sudo[57732]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:21 compute-1 sshd-session[57555]: Connection closed by authenticating user root 178.62.249.31 port 34984 [preauth]
Jan 26 09:35:21 compute-1 sudo[57886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzhgvlmmqwlriwslvoqcetpjbwsbqsps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420121.395794-716-179553374390144/AnsiballZ_stat.py'
Jan 26 09:35:21 compute-1 sudo[57886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:21 compute-1 python3.9[57888]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:35:21 compute-1 sudo[57886]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:22 compute-1 sudo[58013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdmwgxugrsqebcurihntkkfqrytbdxkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420121.395794-716-179553374390144/AnsiballZ_copy.py'
Jan 26 09:35:22 compute-1 sudo[58013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:22 compute-1 python3.9[58015]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420121.395794-716-179553374390144/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:35:22 compute-1 sudo[58013]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:22 compute-1 sshd-session[57961]: Connection closed by authenticating user root 152.42.136.110 port 52804 [preauth]
Jan 26 09:35:24 compute-1 sudo[58167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agaexbbbyxlryybijsdkoueasbbbdhle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420123.5204399-779-218877015931869/AnsiballZ_lineinfile.py'
Jan 26 09:35:24 compute-1 sudo[58167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:24 compute-1 python3.9[58169]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:35:24 compute-1 sudo[58167]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:25 compute-1 sudo[58321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxnjlfbtqexojnpouudfsakqcvxbuduv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420125.2885249-824-203831670323543/AnsiballZ_setup.py'
Jan 26 09:35:25 compute-1 sudo[58321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:25 compute-1 python3.9[58323]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 09:35:26 compute-1 sudo[58321]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:26 compute-1 sudo[58405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-teycxdztlluvgrfbogzzylgvixqntlrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420125.2885249-824-203831670323543/AnsiballZ_systemd.py'
Jan 26 09:35:26 compute-1 sudo[58405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:26 compute-1 python3.9[58407]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:35:26 compute-1 sudo[58405]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:28 compute-1 sudo[58559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bawjtlxfvbbmvunservhndrwalmwtfdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420128.697242-872-262404180362114/AnsiballZ_setup.py'
Jan 26 09:35:28 compute-1 sudo[58559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:29 compute-1 python3.9[58561]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 09:35:29 compute-1 sudo[58559]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:29 compute-1 sudo[58643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guvvapfiwxtkpdaaxluyulfkxjeknljh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420128.697242-872-262404180362114/AnsiballZ_systemd.py'
Jan 26 09:35:29 compute-1 sudo[58643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:30 compute-1 python3.9[58645]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 09:35:30 compute-1 systemd[1]: Stopping NTP client/server...
Jan 26 09:35:30 compute-1 chronyd[791]: chronyd exiting
Jan 26 09:35:30 compute-1 systemd[1]: chronyd.service: Deactivated successfully.
Jan 26 09:35:30 compute-1 systemd[1]: Stopped NTP client/server.
Jan 26 09:35:30 compute-1 systemd[1]: Starting NTP client/server...
Jan 26 09:35:30 compute-1 chronyd[58653]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 26 09:35:30 compute-1 chronyd[58653]: Frequency -26.722 +/- 0.705 ppm read from /var/lib/chrony/drift
Jan 26 09:35:30 compute-1 chronyd[58653]: Loaded seccomp filter (level 2)
Jan 26 09:35:30 compute-1 systemd[1]: Started NTP client/server.
Jan 26 09:35:30 compute-1 sudo[58643]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:30 compute-1 sshd-session[53698]: Connection closed by 192.168.122.30 port 59312
Jan 26 09:35:30 compute-1 sshd-session[53695]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:35:30 compute-1 systemd[1]: session-13.scope: Deactivated successfully.
Jan 26 09:35:30 compute-1 systemd[1]: session-13.scope: Consumed 25.672s CPU time.
Jan 26 09:35:30 compute-1 systemd-logind[783]: Session 13 logged out. Waiting for processes to exit.
Jan 26 09:35:30 compute-1 systemd-logind[783]: Removed session 13.
Jan 26 09:35:35 compute-1 sshd-session[58679]: Accepted publickey for zuul from 192.168.122.30 port 46618 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:35:35 compute-1 systemd-logind[783]: New session 14 of user zuul.
Jan 26 09:35:35 compute-1 systemd[1]: Started Session 14 of User zuul.
Jan 26 09:35:35 compute-1 sshd-session[58679]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:35:36 compute-1 sudo[58832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjnbwvqblpcnwtxwiqfkcmvaxwpjhbze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420136.076316-22-217727601631198/AnsiballZ_file.py'
Jan 26 09:35:36 compute-1 sudo[58832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:36 compute-1 python3.9[58834]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:35:36 compute-1 sudo[58832]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:37 compute-1 sudo[58984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuzjwepxmhnbswiqlvhotdxunlcdmoxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420136.9927318-58-154637409406390/AnsiballZ_stat.py'
Jan 26 09:35:37 compute-1 sudo[58984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:37 compute-1 python3.9[58986]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:35:37 compute-1 sudo[58984]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:38 compute-1 sudo[59107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svftxbikwehktkcilmuupgsurjhejlrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420136.9927318-58-154637409406390/AnsiballZ_copy.py'
Jan 26 09:35:38 compute-1 sudo[59107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:38 compute-1 python3.9[59109]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420136.9927318-58-154637409406390/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:35:38 compute-1 sudo[59107]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:38 compute-1 sshd-session[58682]: Connection closed by 192.168.122.30 port 46618
Jan 26 09:35:38 compute-1 sshd-session[58679]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:35:38 compute-1 systemd[1]: session-14.scope: Deactivated successfully.
Jan 26 09:35:38 compute-1 systemd[1]: session-14.scope: Consumed 1.594s CPU time.
Jan 26 09:35:38 compute-1 systemd-logind[783]: Session 14 logged out. Waiting for processes to exit.
Jan 26 09:35:38 compute-1 systemd-logind[783]: Removed session 14.
Jan 26 09:35:45 compute-1 sshd-session[59134]: Accepted publickey for zuul from 192.168.122.30 port 58126 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:35:45 compute-1 systemd-logind[783]: New session 15 of user zuul.
Jan 26 09:35:45 compute-1 systemd[1]: Started Session 15 of User zuul.
Jan 26 09:35:45 compute-1 sshd-session[59134]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:35:46 compute-1 python3.9[59287]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:35:47 compute-1 sudo[59441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enrirpngmpfuqjdnqslrowdizdvaeaar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420146.7682035-55-168826826139359/AnsiballZ_file.py'
Jan 26 09:35:47 compute-1 sudo[59441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:47 compute-1 python3.9[59443]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:35:47 compute-1 sudo[59441]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:48 compute-1 sudo[59616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kddktsjgnjtjmwgjkvvxqzljwqpxxuma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420147.6604342-79-150054414921999/AnsiballZ_stat.py'
Jan 26 09:35:48 compute-1 sudo[59616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:48 compute-1 python3.9[59618]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:35:48 compute-1 sudo[59616]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:49 compute-1 sudo[59739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dckgihetoncvnczwlybyfewiavoevrzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420147.6604342-79-150054414921999/AnsiballZ_copy.py'
Jan 26 09:35:49 compute-1 sudo[59739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:49 compute-1 python3.9[59741]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769420147.6604342-79-150054414921999/.source.json _original_basename=.35u2wgzl follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:35:49 compute-1 sudo[59739]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:50 compute-1 sudo[59891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycqhzlbvcgnloxquarlxjqgrazgvfgol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420150.1938748-148-206798390406656/AnsiballZ_stat.py'
Jan 26 09:35:50 compute-1 sudo[59891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:50 compute-1 python3.9[59893]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:35:50 compute-1 sudo[59891]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:50 compute-1 sudo[60014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzyjytkxvbotbtdbzvkcpxmnozjhvayj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420150.1938748-148-206798390406656/AnsiballZ_copy.py'
Jan 26 09:35:50 compute-1 sudo[60014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:51 compute-1 python3.9[60016]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420150.1938748-148-206798390406656/.source _original_basename=.wzyp2tud follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:35:51 compute-1 sudo[60014]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:51 compute-1 sudo[60166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmmonediupkkzjgkslwlauugwdjxdmqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420151.543874-196-66814088694718/AnsiballZ_file.py'
Jan 26 09:35:51 compute-1 sudo[60166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:51 compute-1 python3.9[60168]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:35:52 compute-1 sudo[60166]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:52 compute-1 sudo[60318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hujvuibxphvipcyicmdxktgfzmlencaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420152.2653298-220-90465004562318/AnsiballZ_stat.py'
Jan 26 09:35:52 compute-1 sudo[60318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:52 compute-1 python3.9[60320]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:35:52 compute-1 sudo[60318]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:53 compute-1 sudo[60441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjdxfwoxoiwbxxrwmorsrebcvkmwwkyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420152.2653298-220-90465004562318/AnsiballZ_copy.py'
Jan 26 09:35:53 compute-1 sudo[60441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:53 compute-1 python3.9[60443]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769420152.2653298-220-90465004562318/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:35:53 compute-1 sudo[60441]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:53 compute-1 sudo[60593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kropqlpluuvsfiaoaowhmwworshuhswv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420153.4184158-220-34844840439307/AnsiballZ_stat.py'
Jan 26 09:35:53 compute-1 sudo[60593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:53 compute-1 python3.9[60595]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:35:53 compute-1 sudo[60593]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:54 compute-1 sudo[60716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odqlhkhtotpoojbdbyftvoctadypvfqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420153.4184158-220-34844840439307/AnsiballZ_copy.py'
Jan 26 09:35:54 compute-1 sudo[60716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:54 compute-1 python3.9[60718]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769420153.4184158-220-34844840439307/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:35:54 compute-1 sudo[60716]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:54 compute-1 sudo[60868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynqxlqxvkmlkfhujthwamxnjlgjpnjdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420154.5687604-307-229714546497700/AnsiballZ_file.py'
Jan 26 09:35:54 compute-1 sudo[60868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:55 compute-1 python3.9[60870]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:35:55 compute-1 sudo[60868]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:55 compute-1 sudo[61020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afckusywbofysexnffrrzlnwuohuivyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420155.225621-331-137851243582868/AnsiballZ_stat.py'
Jan 26 09:35:55 compute-1 sudo[61020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:55 compute-1 python3.9[61022]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:35:55 compute-1 sudo[61020]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:56 compute-1 sudo[61143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsbdmbyuytbdpabhbsdcgubfhmpdilov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420155.225621-331-137851243582868/AnsiballZ_copy.py'
Jan 26 09:35:56 compute-1 sudo[61143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:56 compute-1 python3.9[61145]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420155.225621-331-137851243582868/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:35:56 compute-1 sudo[61143]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:56 compute-1 sudo[61295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfynxablnnailvjoqlpfyfkpwctcvugq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420156.4243226-376-177409186885814/AnsiballZ_stat.py'
Jan 26 09:35:56 compute-1 sudo[61295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:56 compute-1 python3.9[61297]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:35:56 compute-1 sudo[61295]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:57 compute-1 sudo[61418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cchjfoxrlhwzcyxkprszznivpmwrrgwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420156.4243226-376-177409186885814/AnsiballZ_copy.py'
Jan 26 09:35:57 compute-1 sudo[61418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:57 compute-1 python3.9[61420]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420156.4243226-376-177409186885814/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:35:57 compute-1 sudo[61418]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:58 compute-1 sudo[61570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdjgbuubsyiekftgllwifrobvntqdeho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420157.6422293-421-37726633870385/AnsiballZ_systemd.py'
Jan 26 09:35:58 compute-1 sudo[61570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:58 compute-1 python3.9[61572]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:35:58 compute-1 systemd[1]: Reloading.
Jan 26 09:35:58 compute-1 systemd-rc-local-generator[61598]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:35:58 compute-1 systemd-sysv-generator[61602]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:35:58 compute-1 systemd[1]: Reloading.
Jan 26 09:35:59 compute-1 systemd-rc-local-generator[61635]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:35:59 compute-1 systemd-sysv-generator[61639]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:35:59 compute-1 systemd[1]: Starting EDPM Container Shutdown...
Jan 26 09:35:59 compute-1 systemd[1]: Finished EDPM Container Shutdown.
Jan 26 09:35:59 compute-1 sudo[61570]: pam_unix(sudo:session): session closed for user root
Jan 26 09:35:59 compute-1 sudo[61797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liepirzgpizewuptvslsthojdotrjkxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420159.3804204-445-103471130912335/AnsiballZ_stat.py'
Jan 26 09:35:59 compute-1 sudo[61797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:35:59 compute-1 python3.9[61799]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:35:59 compute-1 sudo[61797]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:00 compute-1 sudo[61920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkohzcxzstmyldtxadwnnrmlkrpcpicn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420159.3804204-445-103471130912335/AnsiballZ_copy.py'
Jan 26 09:36:00 compute-1 sudo[61920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:00 compute-1 python3.9[61922]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420159.3804204-445-103471130912335/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:36:00 compute-1 sudo[61920]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:00 compute-1 sudo[62072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eopmevdlczsglplonlqwehdjrcshpepu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420160.564569-490-268228402200051/AnsiballZ_stat.py'
Jan 26 09:36:00 compute-1 sudo[62072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:01 compute-1 python3.9[62074]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:36:01 compute-1 sudo[62072]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:01 compute-1 sudo[62195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eptoepjrixyhlokhfrjjoaygyonxvmyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420160.564569-490-268228402200051/AnsiballZ_copy.py'
Jan 26 09:36:01 compute-1 sudo[62195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:01 compute-1 python3.9[62197]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420160.564569-490-268228402200051/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:36:01 compute-1 sudo[62195]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:02 compute-1 sudo[62347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdoowmtmjsexnzgptlqvclkxvbqvygzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420161.737687-535-240669198172883/AnsiballZ_systemd.py'
Jan 26 09:36:02 compute-1 sudo[62347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:02 compute-1 python3.9[62349]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:36:02 compute-1 systemd[1]: Reloading.
Jan 26 09:36:02 compute-1 systemd-rc-local-generator[62376]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:36:02 compute-1 systemd-sysv-generator[62379]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:36:02 compute-1 systemd[1]: Reloading.
Jan 26 09:36:02 compute-1 systemd-rc-local-generator[62413]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:36:02 compute-1 systemd-sysv-generator[62417]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:36:02 compute-1 systemd[1]: Starting Create netns directory...
Jan 26 09:36:02 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 09:36:02 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 09:36:02 compute-1 systemd[1]: Finished Create netns directory.
Jan 26 09:36:02 compute-1 sudo[62347]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:03 compute-1 python3.9[62574]: ansible-ansible.builtin.service_facts Invoked
Jan 26 09:36:03 compute-1 network[62591]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 09:36:03 compute-1 network[62592]: 'network-scripts' will be removed from distribution in near future.
Jan 26 09:36:03 compute-1 network[62593]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 09:36:05 compute-1 sshd-session[62627]: Connection closed by authenticating user root 152.42.136.110 port 41884 [preauth]
Jan 26 09:36:06 compute-1 sshd-session[62636]: Connection closed by authenticating user root 104.248.198.71 port 53468 [preauth]
Jan 26 09:36:07 compute-1 sudo[62857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxssjynkanngxrkdpaynzukeyhsokqgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420167.5349245-583-44196461276351/AnsiballZ_systemd.py'
Jan 26 09:36:07 compute-1 sudo[62857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:08 compute-1 python3.9[62859]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:36:08 compute-1 systemd[1]: Reloading.
Jan 26 09:36:08 compute-1 systemd-rc-local-generator[62888]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:36:08 compute-1 systemd-sysv-generator[62891]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:36:08 compute-1 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 26 09:36:08 compute-1 iptables.init[62898]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 26 09:36:08 compute-1 iptables.init[62898]: iptables: Flushing firewall rules: [  OK  ]
Jan 26 09:36:08 compute-1 systemd[1]: iptables.service: Deactivated successfully.
Jan 26 09:36:08 compute-1 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 26 09:36:08 compute-1 sudo[62857]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:09 compute-1 sudo[63092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrssdbmbqtkbwxedwoeiqvcychwpzekt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420168.8100598-583-98766740215341/AnsiballZ_systemd.py'
Jan 26 09:36:09 compute-1 sudo[63092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:09 compute-1 python3.9[63094]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:36:09 compute-1 sudo[63092]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:10 compute-1 sudo[63246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwuhkjtxqvnegolzlnmylqrgkvogtdot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420169.9007382-631-78804736150081/AnsiballZ_systemd.py'
Jan 26 09:36:10 compute-1 sudo[63246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:10 compute-1 python3.9[63248]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:36:10 compute-1 systemd[1]: Reloading.
Jan 26 09:36:10 compute-1 systemd-rc-local-generator[63277]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:36:10 compute-1 systemd-sysv-generator[63281]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:36:10 compute-1 systemd[1]: Starting Netfilter Tables...
Jan 26 09:36:10 compute-1 systemd[1]: Finished Netfilter Tables.
Jan 26 09:36:10 compute-1 sudo[63246]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:11 compute-1 sudo[63439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfivrrljedtpvcdvwticczmfrabarzjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420170.9613774-655-91150473567371/AnsiballZ_command.py'
Jan 26 09:36:11 compute-1 sudo[63439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:11 compute-1 python3.9[63441]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:36:11 compute-1 sudo[63439]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:11 compute-1 sshd-session[63249]: Connection closed by authenticating user root 178.62.249.31 port 33518 [preauth]
Jan 26 09:36:12 compute-1 sudo[63592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghysbniodmuntubavwcujzuiwogcxvgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420172.1761847-697-161030953143602/AnsiballZ_stat.py'
Jan 26 09:36:12 compute-1 sudo[63592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:12 compute-1 python3.9[63594]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:36:12 compute-1 sudo[63592]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:13 compute-1 sudo[63717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auehrapakkpwkizvvtscvarqmmohwwqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420172.1761847-697-161030953143602/AnsiballZ_copy.py'
Jan 26 09:36:13 compute-1 sudo[63717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:13 compute-1 python3.9[63719]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420172.1761847-697-161030953143602/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:36:13 compute-1 sudo[63717]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:13 compute-1 sudo[63870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqrgwtvkufggsmyvmsodexgtmxnyhzya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420173.5738966-742-21392019942727/AnsiballZ_systemd.py'
Jan 26 09:36:13 compute-1 sudo[63870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:14 compute-1 python3.9[63872]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 09:36:14 compute-1 systemd[1]: Reloading OpenSSH server daemon...
Jan 26 09:36:14 compute-1 sshd[1006]: Received SIGHUP; restarting.
Jan 26 09:36:14 compute-1 sshd[1006]: Server listening on 0.0.0.0 port 22.
Jan 26 09:36:14 compute-1 sshd[1006]: Server listening on :: port 22.
Jan 26 09:36:14 compute-1 systemd[1]: Reloaded OpenSSH server daemon.
Jan 26 09:36:14 compute-1 sudo[63870]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:14 compute-1 sudo[64026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mawegmkavedrhpjsbpqkxltqxdfasbgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420174.480727-766-158015691724242/AnsiballZ_file.py'
Jan 26 09:36:14 compute-1 sudo[64026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:14 compute-1 python3.9[64028]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:36:15 compute-1 sudo[64026]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:15 compute-1 sudo[64178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euyhfkamxagykvmiiwpfgsiyanxsbvkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420175.2141292-790-142833297769001/AnsiballZ_stat.py'
Jan 26 09:36:15 compute-1 sudo[64178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:15 compute-1 python3.9[64180]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:36:15 compute-1 sudo[64178]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:16 compute-1 sudo[64301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnilksgjoaropuqbfjjdfaiskrofxcmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420175.2141292-790-142833297769001/AnsiballZ_copy.py'
Jan 26 09:36:16 compute-1 sudo[64301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:16 compute-1 python3.9[64303]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420175.2141292-790-142833297769001/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:36:16 compute-1 sudo[64301]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:17 compute-1 sudo[64453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofawtvdsgtrggbipwotdqimwbxwoyddl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420176.7825315-844-256408487275724/AnsiballZ_timezone.py'
Jan 26 09:36:17 compute-1 sudo[64453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:17 compute-1 python3.9[64455]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 26 09:36:17 compute-1 systemd[1]: Starting Time & Date Service...
Jan 26 09:36:17 compute-1 systemd[1]: Started Time & Date Service.
Jan 26 09:36:17 compute-1 sudo[64453]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:18 compute-1 sudo[64609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahkocoqkpyujzlbbipkshrivhqkdcjtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420177.8698032-871-24396599042975/AnsiballZ_file.py'
Jan 26 09:36:18 compute-1 sudo[64609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:18 compute-1 python3.9[64611]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:36:18 compute-1 sudo[64609]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:18 compute-1 sudo[64761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueikdnetfblnrsariimgctmuyurvprwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420178.5664887-895-9522122889112/AnsiballZ_stat.py'
Jan 26 09:36:18 compute-1 sudo[64761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:19 compute-1 python3.9[64763]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:36:19 compute-1 sudo[64761]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:19 compute-1 sudo[64884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddgaomvjbpfibnwzetcdemjeemdjzwjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420178.5664887-895-9522122889112/AnsiballZ_copy.py'
Jan 26 09:36:19 compute-1 sudo[64884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:19 compute-1 python3.9[64886]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420178.5664887-895-9522122889112/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:36:19 compute-1 sudo[64884]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:20 compute-1 sudo[65036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlpzgghvnhfwrebwqnxuxvqwjyzigjyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420179.8963096-940-20143349708409/AnsiballZ_stat.py'
Jan 26 09:36:20 compute-1 sudo[65036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:20 compute-1 python3.9[65038]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:36:20 compute-1 sudo[65036]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:20 compute-1 sudo[65159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcxpuydwaildhdwrnrsbdyzgjntxzhxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420179.8963096-940-20143349708409/AnsiballZ_copy.py'
Jan 26 09:36:20 compute-1 sudo[65159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:20 compute-1 python3.9[65161]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420179.8963096-940-20143349708409/.source.yaml _original_basename=.ihq_hjjm follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:36:20 compute-1 sudo[65159]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:21 compute-1 sudo[65311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvdfycenlvkhttbaczchnudflsmhzxjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420181.138754-985-102414540591828/AnsiballZ_stat.py'
Jan 26 09:36:21 compute-1 sudo[65311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:21 compute-1 python3.9[65313]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:36:21 compute-1 sudo[65311]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:21 compute-1 sudo[65434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxlylikgfxcwvigpqhcmkwngvkmmrumi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420181.138754-985-102414540591828/AnsiballZ_copy.py'
Jan 26 09:36:21 compute-1 sudo[65434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:22 compute-1 python3.9[65436]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420181.138754-985-102414540591828/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:36:22 compute-1 sudo[65434]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:22 compute-1 sudo[65586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnnkqzlmufjdnqeepxzzaivvzwejjbbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420182.3233633-1030-145160692351969/AnsiballZ_command.py'
Jan 26 09:36:22 compute-1 sudo[65586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:22 compute-1 python3.9[65588]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:36:22 compute-1 sudo[65586]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:23 compute-1 sudo[65739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxjgrchwwjggmxjiwhrhqyaujesvkamv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420183.103908-1054-157407404564485/AnsiballZ_command.py'
Jan 26 09:36:23 compute-1 sudo[65739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:23 compute-1 python3.9[65741]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:36:23 compute-1 sudo[65739]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:24 compute-1 sudo[65892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzldmfcjhgatulultztsucqmwbfbcjey ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769420183.7674491-1078-161924151060295/AnsiballZ_edpm_nftables_from_files.py'
Jan 26 09:36:24 compute-1 sudo[65892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:24 compute-1 python3[65894]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 09:36:24 compute-1 sudo[65892]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:24 compute-1 sudo[66044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctxpdfhjvjrchupzmwdmmysrtmcwsgfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420184.518629-1102-191018156343412/AnsiballZ_stat.py'
Jan 26 09:36:24 compute-1 sudo[66044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:25 compute-1 python3.9[66046]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:36:25 compute-1 sudo[66044]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:25 compute-1 sudo[66167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vudmciizkynvusdgotaouzbqbvdklohk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420184.518629-1102-191018156343412/AnsiballZ_copy.py'
Jan 26 09:36:25 compute-1 sudo[66167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:25 compute-1 python3.9[66169]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420184.518629-1102-191018156343412/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:36:25 compute-1 sudo[66167]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:26 compute-1 sudo[66319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoxxmtkhmuhigkkzhpftbejjpzkdgglp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420185.8124213-1147-74818182575118/AnsiballZ_stat.py'
Jan 26 09:36:26 compute-1 sudo[66319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:26 compute-1 python3.9[66321]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:36:26 compute-1 sudo[66319]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:26 compute-1 sudo[66442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzspxnbtaduilrizprmrtauqfflzphhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420185.8124213-1147-74818182575118/AnsiballZ_copy.py'
Jan 26 09:36:26 compute-1 sudo[66442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:26 compute-1 python3.9[66444]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420185.8124213-1147-74818182575118/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:36:26 compute-1 sudo[66442]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:27 compute-1 sudo[66594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysfgyeywqlwwcyurywpzsaeegcqxyvrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420187.2821174-1192-26973545174162/AnsiballZ_stat.py'
Jan 26 09:36:27 compute-1 sudo[66594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:27 compute-1 python3.9[66596]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:36:27 compute-1 sudo[66594]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:28 compute-1 sudo[66717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htafkyczlgmrxesimhvgbnzfpxywhycg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420187.2821174-1192-26973545174162/AnsiballZ_copy.py'
Jan 26 09:36:28 compute-1 sudo[66717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:28 compute-1 python3.9[66719]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420187.2821174-1192-26973545174162/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:36:28 compute-1 sudo[66717]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:28 compute-1 sudo[66869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkboxqupflfjzkkdonhynvsdbbxvotcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420188.5483458-1237-26728761888648/AnsiballZ_stat.py'
Jan 26 09:36:28 compute-1 sudo[66869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:28 compute-1 python3.9[66871]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:36:29 compute-1 sudo[66869]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:29 compute-1 sudo[66992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csdzsafnreqwneadvmvnaltfwxikzsfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420188.5483458-1237-26728761888648/AnsiballZ_copy.py'
Jan 26 09:36:29 compute-1 sudo[66992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:29 compute-1 python3.9[66994]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420188.5483458-1237-26728761888648/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:36:29 compute-1 sudo[66992]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:30 compute-1 sudo[67144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpsqtxbtjylanevwyulfwoiscyimxvpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420189.8314896-1282-199942946282213/AnsiballZ_stat.py'
Jan 26 09:36:30 compute-1 sudo[67144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:30 compute-1 python3.9[67146]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:36:30 compute-1 sudo[67144]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:30 compute-1 sudo[67267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exisgdspuupimdrcoiwtmzgavqpjktjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420189.8314896-1282-199942946282213/AnsiballZ_copy.py'
Jan 26 09:36:30 compute-1 sudo[67267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:30 compute-1 python3.9[67269]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420189.8314896-1282-199942946282213/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:36:30 compute-1 sudo[67267]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:31 compute-1 sudo[67419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qajmpwkzqcigkfpbnawxywcqtogpfwdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420191.2569053-1327-144317001572788/AnsiballZ_file.py'
Jan 26 09:36:31 compute-1 sudo[67419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:31 compute-1 python3.9[67421]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:36:31 compute-1 sudo[67419]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:32 compute-1 sudo[67571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmifrjudnvajpktdqppjjuedogeylaqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420191.928623-1351-219038627341111/AnsiballZ_command.py'
Jan 26 09:36:32 compute-1 sudo[67571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:32 compute-1 python3.9[67573]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:36:32 compute-1 sudo[67571]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:33 compute-1 sudo[67730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dndscwgpzqzewadqnyazvpunegxuobpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420192.6611388-1376-182177724570718/AnsiballZ_blockinfile.py'
Jan 26 09:36:33 compute-1 sudo[67730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:33 compute-1 python3.9[67732]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:36:33 compute-1 sudo[67730]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:33 compute-1 sudo[67883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksuxtkngdjjxpzmpitpqjekrcwefctco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420193.6383772-1402-100718941139290/AnsiballZ_file.py'
Jan 26 09:36:33 compute-1 sudo[67883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:34 compute-1 python3.9[67885]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:36:34 compute-1 sudo[67883]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:34 compute-1 sudo[68035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylrhrrvfbnixingydvhwswwmrnqtpfyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420194.255142-1402-66800408068465/AnsiballZ_file.py'
Jan 26 09:36:34 compute-1 sudo[68035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:34 compute-1 python3.9[68037]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:36:35 compute-1 sudo[68035]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:35 compute-1 sudo[68187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwuzrklylwvkvoozsbcwrsmfeitmquvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420195.3660524-1447-117267406000175/AnsiballZ_mount.py'
Jan 26 09:36:35 compute-1 sudo[68187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:36 compute-1 python3.9[68189]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 26 09:36:36 compute-1 sudo[68187]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:36 compute-1 sudo[68340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvfbygrbkpdkbbeavqblrwybgfjncvlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420196.1954706-1447-47772906482622/AnsiballZ_mount.py'
Jan 26 09:36:36 compute-1 sudo[68340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:36 compute-1 python3.9[68342]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 26 09:36:36 compute-1 sudo[68340]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:37 compute-1 sshd-session[59137]: Connection closed by 192.168.122.30 port 58126
Jan 26 09:36:37 compute-1 sshd-session[59134]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:37 compute-1 systemd[1]: session-15.scope: Deactivated successfully.
Jan 26 09:36:37 compute-1 systemd[1]: session-15.scope: Consumed 34.654s CPU time.
Jan 26 09:36:37 compute-1 systemd-logind[783]: Session 15 logged out. Waiting for processes to exit.
Jan 26 09:36:37 compute-1 systemd-logind[783]: Removed session 15.
Jan 26 09:36:43 compute-1 sshd-session[68368]: Accepted publickey for zuul from 192.168.122.30 port 51406 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:36:43 compute-1 systemd-logind[783]: New session 16 of user zuul.
Jan 26 09:36:43 compute-1 systemd[1]: Started Session 16 of User zuul.
Jan 26 09:36:43 compute-1 sshd-session[68368]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:36:43 compute-1 sudo[68521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffuhnwskpxilmhhrznuayqbuvjjdonku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420203.1656132-19-225035773409333/AnsiballZ_tempfile.py'
Jan 26 09:36:43 compute-1 sudo[68521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:44 compute-1 python3.9[68523]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 26 09:36:44 compute-1 sudo[68521]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:45 compute-1 sudo[68673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrncfnpzijshebauuslyezhydaclxptf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420205.0752056-55-276763006072351/AnsiballZ_stat.py'
Jan 26 09:36:45 compute-1 sudo[68673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:45 compute-1 python3.9[68675]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:36:45 compute-1 sudo[68673]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:46 compute-1 sudo[68825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhbbgtxjedeoncbagpzklyipdpoefycm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420205.9326313-85-16127066231025/AnsiballZ_setup.py'
Jan 26 09:36:46 compute-1 sudo[68825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:46 compute-1 python3.9[68827]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:36:46 compute-1 sudo[68825]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:47 compute-1 sudo[68977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knonkycuegmrwgugoniymxxqslaitset ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420207.030138-110-212337973886733/AnsiballZ_blockinfile.py'
Jan 26 09:36:47 compute-1 sudo[68977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:47 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 26 09:36:47 compute-1 python3.9[68979]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDm+Vrn31pimz+Of4pkRaSS+qazCMrOF2INZ0EZsyoNG5922K2xwdC9F6r4k2L54HPEpDiazPoDsOHQvs1I+CvayNM2D+8hZhvqxZOMimP8b056aM14nht9ADrJUnlaDs57FkgIKQdxma9I0sW8Up3bbLchFOj2grOjH7gRdUBxblzIS01/P5NV8/kPsRXDoCgx+QAxU2nEqyCQd0JXLKoy+v6t+pG7We9wFXXr2z4XmAx7yeU0Y6NsJ1Seies0apLTmfK3HAtj/3LObvZegqVGDFtl5spotTmJdPJUCZhniaUmyYZ4jtIEno86Bf8OhS3NvLsxmNXuJcInlmCHGXDP9FPBrxG+yVB63FUAeyejCXntEyOzXFp8fiCuOVQuqDTWB4UxTRYh3EqVruxhY1taarew/VfsxIAxv6BWsqtvh/6xtRtJ9vTSDHsDTRaOcChfT5BnATFJ+Ilwpve8C4bjRVdlStH+99TgtNPOg2Fxf8scyIHInM9c4Yn7g8YTiyk=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICrJdFptF1rp2hjeKcc0nSEhHvDtAYFU4gfqZN6U+WTb
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNa2lKVjuYCljd0rl1qDkTP3ZoTV9fkbcXvtxSizwygrF6dU+RWdeB3LOkT5U/2GTJuWvOqxJBc3Y1d0b3Dj5Do=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCyi0WEBS9Gc5Xay4vqFSdv0cJGdtezg+CrNF/vjEeF3l4EhpAAj7XRLEhEU1kz0DDKkzclG65hBNPO4/9cfzEa31EsSmzOqjqZp5ri20HVDkiZlUTTklhrbJGydUw6mcy+rIN1qsUugVHwkA9ufZLvzm9wvljzL+WPt1o41GT42NdNzyfPfnqf7HMDziNUNUUZjqsoy+DQnlMl3c3NHiGysPJ6IssbLBCFzPdBHpEYmR8b44qlJEhx3RYWl3QLcXAyoK7VpPdFO4ltMT+0KVVbLO9IUrocCQ4HfafPn/mV1Rq3phDWvCTRfRo07Mu4Oc4XBu+RIk9tt1WTIdT/ZusPUNSkFgprdU9zFIHLR0KyIX4qRSuWBeB20Ic5pvkRvNtwLB8lPt4NVi7bmun6moO8nu6cOjJ61CCAobDSEL/Z2cG3ADucjCSKtWLM0eSdt6T71NmULMhdB8ljIK4em/NCf/qZWjYr70WKyIZ9b8N5lDO8NF1tbPJyu+O0ebq/JN8=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAaib//yQ1QyvWijjfui4OBtTtMt7Dos+hlx8rucs2Tn
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP7YXsQWyEQWSdy5tcEAtltn11CwuaqW/S8S3OB1580hTlcLZWLPDHbzSwNDf13HBG9wgLFgmueLB8U6J7wvvcM=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC0TZpcPGqQPKNdLKsJSWd1uRV3wOVDiIo3gYwVWAuH5m+Wvpw34ZI+6+d4y3DWMqDRZVWAVV0NNFB+b4MQeivx4S7KMCvBctzJ6VIyUDL5NZrwys0sYPH+33ncdZd6C8LrfCvIct+DbWCx72RQ+G0yRbYK1r/m5+dzW2411NqWn8kJkBUeLJIqT2vhFoNpO8NaWSVlWEgl5YunYEPS4v5NSM88ke6Gzc5X5sjxsz65REj6/1BXsA+quwcTAe/KC1/1Rr2cufefwf0uayM6sGuUDATjWIw36YqUeL9wc/IDdIEFEvj2hr/v+r6laaKMidOYJXBiQwIWpgWCOosSj4vrPQmDfqjOa8sAn7yWPVgxyARccavEO89zV2lpFcYTdqegPxjB90lD3Q1pMU6veJUWTRo0LAZ6n9rsRBgF0Mhr75T32Lbqf3KBro6/nPrp1XCD08mNv2cEYwp+put7vwvHzN1nPztqMsIDAMJMupwI+Buyr3xCPHe3hcAavahF+YM=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINbUUMKlV4hksqDn2YVVAHPCHip80h7zj0rReM94Ja2l
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFtD30BOt1BlR6BYm8DU7sxF5fAzZ/aciKetiRsXWlbsXS3Z4mVG1ZAF9AhArV+OaapsLeaQFybIC0e2fudJfos=
                                             create=True mode=0644 path=/tmp/ansible.5wv34se6 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:36:47 compute-1 sudo[68977]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:48 compute-1 sudo[69131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxoudseooxuvmbrzhjafynxpzcvvkkri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420207.8429418-134-270029996738966/AnsiballZ_command.py'
Jan 26 09:36:48 compute-1 sudo[69131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:48 compute-1 python3.9[69133]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.5wv34se6' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:36:48 compute-1 sudo[69131]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:49 compute-1 sudo[69285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-perpteunenhouhhucpbkoihtyayubtmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420208.672528-158-54691232966635/AnsiballZ_file.py'
Jan 26 09:36:49 compute-1 sudo[69285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:49 compute-1 python3.9[69287]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.5wv34se6 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:36:49 compute-1 sudo[69285]: pam_unix(sudo:session): session closed for user root
Jan 26 09:36:49 compute-1 sshd-session[68371]: Connection closed by 192.168.122.30 port 51406
Jan 26 09:36:49 compute-1 sshd-session[68368]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:49 compute-1 systemd[1]: session-16.scope: Deactivated successfully.
Jan 26 09:36:49 compute-1 systemd[1]: session-16.scope: Consumed 3.334s CPU time.
Jan 26 09:36:49 compute-1 systemd-logind[783]: Session 16 logged out. Waiting for processes to exit.
Jan 26 09:36:49 compute-1 systemd-logind[783]: Removed session 16.
Jan 26 09:36:50 compute-1 sshd-session[69312]: Connection closed by authenticating user root 152.42.136.110 port 54288 [preauth]
Jan 26 09:36:57 compute-1 sshd-session[69314]: Accepted publickey for zuul from 192.168.122.30 port 38902 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:36:57 compute-1 systemd-logind[783]: New session 17 of user zuul.
Jan 26 09:36:57 compute-1 systemd[1]: Started Session 17 of User zuul.
Jan 26 09:36:57 compute-1 sshd-session[69314]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:36:58 compute-1 python3.9[69469]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:36:59 compute-1 sshd-session[69370]: Connection closed by authenticating user root 178.62.249.31 port 56354 [preauth]
Jan 26 09:36:59 compute-1 sudo[69623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giueyqanlzgytplqnxdklagpitjlolcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420218.8113692-52-43742102932131/AnsiballZ_systemd.py'
Jan 26 09:36:59 compute-1 sudo[69623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:36:59 compute-1 python3.9[69625]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 26 09:36:59 compute-1 sudo[69623]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:00 compute-1 sudo[69777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qieouxunkyjtkbiwytguargqljfdipzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420219.9824023-76-240797303999391/AnsiballZ_systemd.py'
Jan 26 09:37:00 compute-1 sudo[69777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:00 compute-1 python3.9[69779]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 09:37:00 compute-1 sudo[69777]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:01 compute-1 sudo[69932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahvorkcwwccxhclembikukzlfminsgtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420220.8342826-103-2350960315531/AnsiballZ_command.py'
Jan 26 09:37:01 compute-1 sudo[69932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:01 compute-1 sshd-session[69857]: Connection closed by authenticating user root 104.248.198.71 port 45002 [preauth]
Jan 26 09:37:01 compute-1 python3.9[69934]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:37:01 compute-1 sudo[69932]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:02 compute-1 sudo[70085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zitzkkvhqypeaukouecozpezpnqtewpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420221.6275637-127-211427555317698/AnsiballZ_stat.py'
Jan 26 09:37:02 compute-1 sudo[70085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:02 compute-1 python3.9[70087]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:37:02 compute-1 sudo[70085]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:02 compute-1 sudo[70239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxkuprmprdplzskflllhqtmdtbtenjuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420222.4770112-151-103353495272041/AnsiballZ_command.py'
Jan 26 09:37:02 compute-1 sudo[70239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:02 compute-1 python3.9[70241]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:37:02 compute-1 sudo[70239]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:03 compute-1 sudo[70394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoxnrftqsfxbdcziswzplidvtjlgwswi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420223.1529038-175-69054495047677/AnsiballZ_file.py'
Jan 26 09:37:03 compute-1 sudo[70394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:03 compute-1 python3.9[70396]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:37:03 compute-1 sudo[70394]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:04 compute-1 sshd-session[69317]: Connection closed by 192.168.122.30 port 38902
Jan 26 09:37:04 compute-1 sshd-session[69314]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:37:04 compute-1 systemd[1]: session-17.scope: Deactivated successfully.
Jan 26 09:37:04 compute-1 systemd[1]: session-17.scope: Consumed 4.472s CPU time.
Jan 26 09:37:04 compute-1 systemd-logind[783]: Session 17 logged out. Waiting for processes to exit.
Jan 26 09:37:04 compute-1 systemd-logind[783]: Removed session 17.
Jan 26 09:37:09 compute-1 sshd-session[70422]: Accepted publickey for zuul from 192.168.122.30 port 36118 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:37:09 compute-1 systemd-logind[783]: New session 18 of user zuul.
Jan 26 09:37:09 compute-1 systemd[1]: Started Session 18 of User zuul.
Jan 26 09:37:09 compute-1 sshd-session[70422]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:37:10 compute-1 python3.9[70575]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:37:11 compute-1 sudo[70729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytjlfgchjjkuwmuctzjpavhcvzkfbknr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420231.0773299-58-178329251461527/AnsiballZ_setup.py'
Jan 26 09:37:11 compute-1 sudo[70729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:11 compute-1 python3.9[70731]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 09:37:11 compute-1 sudo[70729]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:12 compute-1 sudo[70813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nshotcsrqkabxaemgohmldvajlhvgbny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420231.0773299-58-178329251461527/AnsiballZ_dnf.py'
Jan 26 09:37:12 compute-1 sudo[70813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:12 compute-1 python3.9[70815]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 09:37:13 compute-1 sudo[70813]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:14 compute-1 python3.9[70966]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:37:15 compute-1 python3.9[71117]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 09:37:16 compute-1 python3.9[71267]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:37:16 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 09:37:17 compute-1 python3.9[71418]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:37:17 compute-1 sshd-session[70425]: Connection closed by 192.168.122.30 port 36118
Jan 26 09:37:17 compute-1 sshd-session[70422]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:37:17 compute-1 systemd[1]: session-18.scope: Deactivated successfully.
Jan 26 09:37:17 compute-1 systemd[1]: session-18.scope: Consumed 5.779s CPU time.
Jan 26 09:37:17 compute-1 systemd-logind[783]: Session 18 logged out. Waiting for processes to exit.
Jan 26 09:37:17 compute-1 systemd-logind[783]: Removed session 18.
Jan 26 09:37:25 compute-1 sshd-session[71443]: Accepted publickey for zuul from 38.102.83.222 port 48076 ssh2: RSA SHA256:pzGu/8MlhtIDRxsRqlS4AZ6R7CLTQo7Ke10EmY50Qfo
Jan 26 09:37:25 compute-1 systemd-logind[783]: New session 19 of user zuul.
Jan 26 09:37:25 compute-1 systemd[1]: Started Session 19 of User zuul.
Jan 26 09:37:25 compute-1 sshd-session[71443]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:37:26 compute-1 sudo[71519]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uavhwubpidukgajenglimvocskvquttc ; /usr/bin/python3'
Jan 26 09:37:26 compute-1 sudo[71519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:26 compute-1 useradd[71523]: new group: name=ceph-admin, GID=42478
Jan 26 09:37:26 compute-1 useradd[71523]: new user: name=ceph-admin, UID=42477, GID=42478, home=/home/ceph-admin, shell=/bin/bash, from=none
Jan 26 09:37:26 compute-1 sudo[71519]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:26 compute-1 sudo[71605]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qykblwqatcmvbcrekxfpxapakbkgytxo ; /usr/bin/python3'
Jan 26 09:37:26 compute-1 sudo[71605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:26 compute-1 sudo[71605]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:27 compute-1 sudo[71678]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uicmjgnnnnnrpkwoocvuwgjojhdidspf ; /usr/bin/python3'
Jan 26 09:37:27 compute-1 sudo[71678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:27 compute-1 sudo[71678]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:27 compute-1 sudo[71728]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaejmqwdlzwfbckyiyudmdniqfdsccez ; /usr/bin/python3'
Jan 26 09:37:27 compute-1 sudo[71728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:27 compute-1 sudo[71728]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:28 compute-1 sudo[71754]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhcjywymmzfwsedqzjspgubxycqmnsyn ; /usr/bin/python3'
Jan 26 09:37:28 compute-1 sudo[71754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:28 compute-1 sudo[71754]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:28 compute-1 sudo[71780]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrezwqpqcuipopmnvpeqxqtyosgptyjz ; /usr/bin/python3'
Jan 26 09:37:28 compute-1 sudo[71780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:28 compute-1 sudo[71780]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:29 compute-1 sudo[71806]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fymiozacwyhqkrgdknhoeksusorrkrdk ; /usr/bin/python3'
Jan 26 09:37:29 compute-1 sudo[71806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:29 compute-1 sudo[71806]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:29 compute-1 sudo[71884]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mybqvdrfufggrhrscesxlxevefuntqcq ; /usr/bin/python3'
Jan 26 09:37:29 compute-1 sudo[71884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:29 compute-1 sudo[71884]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:29 compute-1 sudo[71957]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmikueotrjscafmnosircoktvdpbhmlm ; /usr/bin/python3'
Jan 26 09:37:29 compute-1 sudo[71957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:30 compute-1 sudo[71957]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:30 compute-1 sudo[72059]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afterltjzkngutdyadmuazuygjblqdcv ; /usr/bin/python3'
Jan 26 09:37:30 compute-1 sudo[72059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:30 compute-1 sudo[72059]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:30 compute-1 sudo[72132]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aydmttflfoasnmdgpcnbnuwovngnvwss ; /usr/bin/python3'
Jan 26 09:37:30 compute-1 sudo[72132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:31 compute-1 sudo[72132]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:31 compute-1 sudo[72182]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmphczabhirnhiaypeubxbwkhpaavxmc ; /usr/bin/python3'
Jan 26 09:37:31 compute-1 sudo[72182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:31 compute-1 python3[72184]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:37:32 compute-1 sudo[72182]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:33 compute-1 sudo[72277]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpxelhmtquuzaijfzexmdnujuoxrzogh ; /usr/bin/python3'
Jan 26 09:37:33 compute-1 sudo[72277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:33 compute-1 python3[72279]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 26 09:37:34 compute-1 sudo[72277]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:34 compute-1 sudo[72305]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fscllfcevdjqsqbntwsfwsoncxujlltr ; /usr/bin/python3'
Jan 26 09:37:34 compute-1 sudo[72305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:35 compute-1 python3[72307]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 26 09:37:35 compute-1 sudo[72305]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:35 compute-1 sudo[72332]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdfedvqusklgchqxayrsiadruccelexn ; /usr/bin/python3'
Jan 26 09:37:35 compute-1 sudo[72332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:35 compute-1 python3[72334]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G
                                          losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                          lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:37:35 compute-1 kernel: loop: module loaded
Jan 26 09:37:35 compute-1 kernel: loop3: detected capacity change from 0 to 41943040
Jan 26 09:37:35 compute-1 sudo[72332]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:35 compute-1 sudo[72367]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzcdmlkiaowdyzxpxmpbeixkbxuurspg ; /usr/bin/python3'
Jan 26 09:37:35 compute-1 sudo[72367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:35 compute-1 python3[72369]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                          vgcreate ceph_vg0 /dev/loop3
                                          lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                          lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:37:35 compute-1 sshd-session[72281]: Connection closed by authenticating user root 152.42.136.110 port 39428 [preauth]
Jan 26 09:37:36 compute-1 lvm[72372]: PV /dev/loop3 not used.
Jan 26 09:37:36 compute-1 lvm[72374]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 09:37:36 compute-1 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Jan 26 09:37:36 compute-1 lvm[72382]:   1 logical volume(s) in volume group "ceph_vg0" now active
Jan 26 09:37:36 compute-1 lvm[72384]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 09:37:36 compute-1 lvm[72384]: VG ceph_vg0 finished
Jan 26 09:37:36 compute-1 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Jan 26 09:37:36 compute-1 sudo[72367]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:36 compute-1 sudo[72460]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afbuerxtretwambccuelxibungkzzgdb ; /usr/bin/python3'
Jan 26 09:37:36 compute-1 sudo[72460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:36 compute-1 python3[72462]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 09:37:36 compute-1 sudo[72460]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:36 compute-1 sudo[72533]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynohmdcnrcokkcjatsfxupvsrmicwgcd ; /usr/bin/python3'
Jan 26 09:37:36 compute-1 sudo[72533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:37 compute-1 python3[72535]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769420256.3965585-36916-47310311552383/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:37:37 compute-1 sudo[72533]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:37 compute-1 sudo[72583]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsrwadantneszjtlhdlzqrydmudkbvjv ; /usr/bin/python3'
Jan 26 09:37:37 compute-1 sudo[72583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:37 compute-1 python3[72585]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:37:37 compute-1 systemd[1]: Reloading.
Jan 26 09:37:37 compute-1 systemd-rc-local-generator[72614]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:37:37 compute-1 systemd-sysv-generator[72617]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:37:38 compute-1 systemd[1]: Starting Ceph OSD losetup...
Jan 26 09:37:38 compute-1 bash[72624]: /dev/loop3: [64513]:4328448 (/var/lib/ceph-osd-0.img)
Jan 26 09:37:38 compute-1 systemd[1]: Finished Ceph OSD losetup.
Jan 26 09:37:38 compute-1 lvm[72625]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 09:37:38 compute-1 lvm[72625]: VG ceph_vg0 finished
Jan 26 09:37:38 compute-1 sudo[72583]: pam_unix(sudo:session): session closed for user root
Jan 26 09:37:39 compute-1 chronyd[58653]: Selected source 167.160.187.12 (pool.ntp.org)
Jan 26 09:37:40 compute-1 python3[72650]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:37:46 compute-1 sshd-session[72694]: Connection closed by authenticating user root 178.62.249.31 port 48566 [preauth]
Jan 26 09:37:58 compute-1 sshd-session[72696]: Invalid user admin from 104.248.198.71 port 44898
Jan 26 09:37:58 compute-1 sshd-session[72696]: Connection closed by invalid user admin 104.248.198.71 port 44898 [preauth]
Jan 26 09:38:20 compute-1 sshd-session[72698]: Connection closed by authenticating user root 152.42.136.110 port 50294 [preauth]
Jan 26 09:38:33 compute-1 sshd-session[72700]: Connection closed by authenticating user root 178.62.249.31 port 50368 [preauth]
Jan 26 09:38:53 compute-1 sshd-session[72703]: Invalid user admin from 104.248.198.71 port 47830
Jan 26 09:38:53 compute-1 sshd-session[72703]: Connection closed by invalid user admin 104.248.198.71 port 47830 [preauth]
Jan 26 09:39:01 compute-1 anacron[4199]: Job `cron.weekly' started
Jan 26 09:39:01 compute-1 anacron[4199]: Job `cron.weekly' terminated
Jan 26 09:39:05 compute-1 sshd-session[72707]: Connection closed by authenticating user root 152.42.136.110 port 59918 [preauth]
Jan 26 09:39:13 compute-1 sshd-session[72709]: Accepted publickey for ceph-admin from 192.168.122.100 port 33812 ssh2: RSA SHA256:cGz1g5qmzBfeiAiDRElnaAonZh1cdMIZMAXyGkEzbws
Jan 26 09:39:13 compute-1 systemd[1]: Created slice User Slice of UID 42477.
Jan 26 09:39:13 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 26 09:39:13 compute-1 systemd-logind[783]: New session 20 of user ceph-admin.
Jan 26 09:39:13 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 26 09:39:13 compute-1 systemd[1]: Starting User Manager for UID 42477...
Jan 26 09:39:13 compute-1 systemd[72713]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 26 09:39:13 compute-1 systemd[72713]: Queued start job for default target Main User Target.
Jan 26 09:39:13 compute-1 systemd[72713]: Created slice User Application Slice.
Jan 26 09:39:13 compute-1 systemd[72713]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 26 09:39:13 compute-1 systemd[72713]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 09:39:13 compute-1 systemd[72713]: Reached target Paths.
Jan 26 09:39:13 compute-1 systemd[72713]: Reached target Timers.
Jan 26 09:39:13 compute-1 systemd[72713]: Starting D-Bus User Message Bus Socket...
Jan 26 09:39:13 compute-1 systemd[72713]: Starting Create User's Volatile Files and Directories...
Jan 26 09:39:13 compute-1 systemd[72713]: Finished Create User's Volatile Files and Directories.
Jan 26 09:39:13 compute-1 systemd[72713]: Listening on D-Bus User Message Bus Socket.
Jan 26 09:39:13 compute-1 systemd[72713]: Reached target Sockets.
Jan 26 09:39:13 compute-1 systemd[72713]: Reached target Basic System.
Jan 26 09:39:13 compute-1 systemd[72713]: Reached target Main User Target.
Jan 26 09:39:13 compute-1 systemd[72713]: Startup finished in 116ms.
Jan 26 09:39:13 compute-1 sshd-session[72726]: Accepted publickey for ceph-admin from 192.168.122.100 port 33820 ssh2: RSA SHA256:cGz1g5qmzBfeiAiDRElnaAonZh1cdMIZMAXyGkEzbws
Jan 26 09:39:13 compute-1 systemd[1]: Started User Manager for UID 42477.
Jan 26 09:39:13 compute-1 systemd[1]: Started Session 20 of User ceph-admin.
Jan 26 09:39:13 compute-1 sshd-session[72709]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 26 09:39:13 compute-1 systemd-logind[783]: New session 22 of user ceph-admin.
Jan 26 09:39:13 compute-1 systemd[1]: Started Session 22 of User ceph-admin.
Jan 26 09:39:13 compute-1 sshd-session[72726]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 26 09:39:13 compute-1 sudo[72733]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:39:13 compute-1 sudo[72733]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:39:13 compute-1 sudo[72733]: pam_unix(sudo:session): session closed for user root
Jan 26 09:39:13 compute-1 sshd-session[72758]: Accepted publickey for ceph-admin from 192.168.122.100 port 33822 ssh2: RSA SHA256:cGz1g5qmzBfeiAiDRElnaAonZh1cdMIZMAXyGkEzbws
Jan 26 09:39:13 compute-1 systemd-logind[783]: New session 23 of user ceph-admin.
Jan 26 09:39:13 compute-1 systemd[1]: Started Session 23 of User ceph-admin.
Jan 26 09:39:13 compute-1 sshd-session[72758]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 26 09:39:13 compute-1 sudo[72762]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host --expect-hostname compute-1
Jan 26 09:39:13 compute-1 sudo[72762]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:39:13 compute-1 sudo[72762]: pam_unix(sudo:session): session closed for user root
Jan 26 09:39:14 compute-1 sshd-session[72787]: Accepted publickey for ceph-admin from 192.168.122.100 port 33826 ssh2: RSA SHA256:cGz1g5qmzBfeiAiDRElnaAonZh1cdMIZMAXyGkEzbws
Jan 26 09:39:14 compute-1 systemd-logind[783]: New session 24 of user ceph-admin.
Jan 26 09:39:14 compute-1 systemd[1]: Started Session 24 of User ceph-admin.
Jan 26 09:39:14 compute-1 sshd-session[72787]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 26 09:39:14 compute-1 sudo[72791]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36
Jan 26 09:39:14 compute-1 sudo[72791]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:39:14 compute-1 sudo[72791]: pam_unix(sudo:session): session closed for user root
Jan 26 09:39:14 compute-1 sshd-session[72816]: Accepted publickey for ceph-admin from 192.168.122.100 port 33836 ssh2: RSA SHA256:cGz1g5qmzBfeiAiDRElnaAonZh1cdMIZMAXyGkEzbws
Jan 26 09:39:14 compute-1 systemd-logind[783]: New session 25 of user ceph-admin.
Jan 26 09:39:14 compute-1 systemd[1]: Started Session 25 of User ceph-admin.
Jan 26 09:39:14 compute-1 sshd-session[72816]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 26 09:39:14 compute-1 sudo[72820]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:39:14 compute-1 sudo[72820]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:39:14 compute-1 sudo[72820]: pam_unix(sudo:session): session closed for user root
Jan 26 09:39:14 compute-1 sshd-session[72845]: Accepted publickey for ceph-admin from 192.168.122.100 port 33842 ssh2: RSA SHA256:cGz1g5qmzBfeiAiDRElnaAonZh1cdMIZMAXyGkEzbws
Jan 26 09:39:14 compute-1 systemd-logind[783]: New session 26 of user ceph-admin.
Jan 26 09:39:14 compute-1 systemd[1]: Started Session 26 of User ceph-admin.
Jan 26 09:39:14 compute-1 sshd-session[72845]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 26 09:39:14 compute-1 sudo[72849]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:39:14 compute-1 sudo[72849]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:39:14 compute-1 sudo[72849]: pam_unix(sudo:session): session closed for user root
Jan 26 09:39:15 compute-1 sshd-session[72874]: Accepted publickey for ceph-admin from 192.168.122.100 port 33854 ssh2: RSA SHA256:cGz1g5qmzBfeiAiDRElnaAonZh1cdMIZMAXyGkEzbws
Jan 26 09:39:15 compute-1 systemd-logind[783]: New session 27 of user ceph-admin.
Jan 26 09:39:15 compute-1 systemd[1]: Started Session 27 of User ceph-admin.
Jan 26 09:39:15 compute-1 sshd-session[72874]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 26 09:39:15 compute-1 sudo[72878]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new
Jan 26 09:39:15 compute-1 sudo[72878]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:39:15 compute-1 sudo[72878]: pam_unix(sudo:session): session closed for user root
Jan 26 09:39:15 compute-1 sshd-session[72903]: Accepted publickey for ceph-admin from 192.168.122.100 port 33862 ssh2: RSA SHA256:cGz1g5qmzBfeiAiDRElnaAonZh1cdMIZMAXyGkEzbws
Jan 26 09:39:15 compute-1 systemd-logind[783]: New session 28 of user ceph-admin.
Jan 26 09:39:15 compute-1 systemd[1]: Started Session 28 of User ceph-admin.
Jan 26 09:39:15 compute-1 sshd-session[72903]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 26 09:39:15 compute-1 sudo[72907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:39:15 compute-1 sudo[72907]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:39:15 compute-1 sudo[72907]: pam_unix(sudo:session): session closed for user root
Jan 26 09:39:15 compute-1 sshd-session[72932]: Accepted publickey for ceph-admin from 192.168.122.100 port 33866 ssh2: RSA SHA256:cGz1g5qmzBfeiAiDRElnaAonZh1cdMIZMAXyGkEzbws
Jan 26 09:39:15 compute-1 systemd-logind[783]: New session 29 of user ceph-admin.
Jan 26 09:39:15 compute-1 systemd[1]: Started Session 29 of User ceph-admin.
Jan 26 09:39:15 compute-1 sshd-session[72932]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 26 09:39:15 compute-1 sudo[72936]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new
Jan 26 09:39:15 compute-1 sudo[72936]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:39:15 compute-1 sudo[72936]: pam_unix(sudo:session): session closed for user root
Jan 26 09:39:16 compute-1 sshd-session[72961]: Accepted publickey for ceph-admin from 192.168.122.100 port 33868 ssh2: RSA SHA256:cGz1g5qmzBfeiAiDRElnaAonZh1cdMIZMAXyGkEzbws
Jan 26 09:39:16 compute-1 systemd-logind[783]: New session 30 of user ceph-admin.
Jan 26 09:39:16 compute-1 systemd[1]: Started Session 30 of User ceph-admin.
Jan 26 09:39:16 compute-1 sshd-session[72961]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 26 09:39:17 compute-1 sshd-session[72988]: Accepted publickey for ceph-admin from 192.168.122.100 port 33880 ssh2: RSA SHA256:cGz1g5qmzBfeiAiDRElnaAonZh1cdMIZMAXyGkEzbws
Jan 26 09:39:17 compute-1 systemd-logind[783]: New session 31 of user ceph-admin.
Jan 26 09:39:17 compute-1 systemd[1]: Started Session 31 of User ceph-admin.
Jan 26 09:39:17 compute-1 sshd-session[72988]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 26 09:39:17 compute-1 sudo[72992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36
Jan 26 09:39:17 compute-1 sudo[72992]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:39:17 compute-1 sudo[72992]: pam_unix(sudo:session): session closed for user root
Jan 26 09:39:17 compute-1 sshd-session[73017]: Accepted publickey for ceph-admin from 192.168.122.100 port 33886 ssh2: RSA SHA256:cGz1g5qmzBfeiAiDRElnaAonZh1cdMIZMAXyGkEzbws
Jan 26 09:39:17 compute-1 systemd-logind[783]: New session 32 of user ceph-admin.
Jan 26 09:39:17 compute-1 systemd[1]: Started Session 32 of User ceph-admin.
Jan 26 09:39:17 compute-1 sshd-session[73017]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 26 09:39:17 compute-1 sudo[73021]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host --expect-hostname compute-1
Jan 26 09:39:17 compute-1 sudo[73021]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:39:18 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 09:39:18 compute-1 sudo[73021]: pam_unix(sudo:session): session closed for user root
Jan 26 09:39:18 compute-1 sudo[73065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:39:18 compute-1 sudo[73065]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:39:18 compute-1 sudo[73065]: pam_unix(sudo:session): session closed for user root
Jan 26 09:39:18 compute-1 sudo[73090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Jan 26 09:39:18 compute-1 sudo[73090]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:39:18 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 09:39:18 compute-1 sudo[73090]: pam_unix(sudo:session): session closed for user root
Jan 26 09:39:18 compute-1 sudo[73135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:39:18 compute-1 sudo[73135]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:39:18 compute-1 sudo[73135]: pam_unix(sudo:session): session closed for user root
Jan 26 09:39:18 compute-1 sudo[73160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 26 09:39:18 compute-1 sudo[73160]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:39:19 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 09:39:19 compute-1 sudo[73160]: pam_unix(sudo:session): session closed for user root
Jan 26 09:39:19 compute-1 sudo[73221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:39:19 compute-1 sudo[73221]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:39:19 compute-1 sudo[73221]: pam_unix(sudo:session): session closed for user root
Jan 26 09:39:19 compute-1 sudo[73247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 09:39:19 compute-1 sudo[73247]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:39:19 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 09:39:19 compute-1 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73285 (sysctl)
Jan 26 09:39:19 compute-1 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 26 09:39:19 compute-1 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 26 09:39:19 compute-1 sudo[73247]: pam_unix(sudo:session): session closed for user root
Jan 26 09:39:19 compute-1 sudo[73307]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:39:19 compute-1 sudo[73307]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:39:19 compute-1 sudo[73307]: pam_unix(sudo:session): session closed for user root
Jan 26 09:39:19 compute-1 sudo[73332]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Jan 26 09:39:19 compute-1 sudo[73332]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:39:20 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 09:39:20 compute-1 sudo[73332]: pam_unix(sudo:session): session closed for user root
Jan 26 09:39:20 compute-1 sudo[73375]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:39:20 compute-1 sudo[73375]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:39:20 compute-1 sudo[73375]: pam_unix(sudo:session): session closed for user root
Jan 26 09:39:20 compute-1 sudo[73400]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30 -- inventory --format=json-pretty --filter-for-batch
Jan 26 09:39:20 compute-1 sudo[73400]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:39:20 compute-1 sshd-session[73246]: Connection closed by authenticating user root 178.62.249.31 port 53344 [preauth]
Jan 26 09:39:20 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 09:39:23 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat2388042479-lower\x2dmapped.mount: Deactivated successfully.
Jan 26 09:39:46 compute-1 sshd-session[73520]: Invalid user admin from 104.248.198.71 port 60546
Jan 26 09:39:46 compute-1 sshd-session[73520]: Connection closed by invalid user admin 104.248.198.71 port 60546 [preauth]
Jan 26 09:39:50 compute-1 sshd-session[73523]: Connection closed by authenticating user root 152.42.136.110 port 57620 [preauth]
Jan 26 09:40:01 compute-1 podman[73459]: 2026-01-26 09:40:01.180490209 +0000 UTC m=+40.530093498 container create eb7e0a79194f723f10e215350402c36bea81ec5855a2e789879af3dc7ea0b036 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_moser, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:40:01 compute-1 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck2832144697-merged.mount: Deactivated successfully.
Jan 26 09:40:01 compute-1 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 26 09:40:01 compute-1 systemd[1]: Started libpod-conmon-eb7e0a79194f723f10e215350402c36bea81ec5855a2e789879af3dc7ea0b036.scope.
Jan 26 09:40:01 compute-1 podman[73459]: 2026-01-26 09:40:01.164690848 +0000 UTC m=+40.514294167 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:40:01 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:40:01 compute-1 podman[73459]: 2026-01-26 09:40:01.293158237 +0000 UTC m=+40.642761546 container init eb7e0a79194f723f10e215350402c36bea81ec5855a2e789879af3dc7ea0b036 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_moser, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20250325)
Jan 26 09:40:01 compute-1 podman[73459]: 2026-01-26 09:40:01.303372259 +0000 UTC m=+40.652975548 container start eb7e0a79194f723f10e215350402c36bea81ec5855a2e789879af3dc7ea0b036 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_moser, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 26 09:40:01 compute-1 podman[73459]: 2026-01-26 09:40:01.307416565 +0000 UTC m=+40.657019854 container attach eb7e0a79194f723f10e215350402c36bea81ec5855a2e789879af3dc7ea0b036 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_moser, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 09:40:01 compute-1 focused_moser[73529]: 167 167
Jan 26 09:40:01 compute-1 systemd[1]: libpod-eb7e0a79194f723f10e215350402c36bea81ec5855a2e789879af3dc7ea0b036.scope: Deactivated successfully.
Jan 26 09:40:01 compute-1 podman[73459]: 2026-01-26 09:40:01.313041415 +0000 UTC m=+40.662644724 container died eb7e0a79194f723f10e215350402c36bea81ec5855a2e789879af3dc7ea0b036 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_moser, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:40:01 compute-1 systemd[1]: var-lib-containers-storage-overlay-570d2b2daafeebf8abf9f02a6732ab3a7c077ce9ae325ec0f26b735608f32535-merged.mount: Deactivated successfully.
Jan 26 09:40:01 compute-1 podman[73459]: 2026-01-26 09:40:01.505525124 +0000 UTC m=+40.855128413 container remove eb7e0a79194f723f10e215350402c36bea81ec5855a2e789879af3dc7ea0b036 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_moser, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default)
Jan 26 09:40:01 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 09:40:01 compute-1 systemd[1]: libpod-conmon-eb7e0a79194f723f10e215350402c36bea81ec5855a2e789879af3dc7ea0b036.scope: Deactivated successfully.
Jan 26 09:40:01 compute-1 podman[73555]: 2026-01-26 09:40:01.696447037 +0000 UTC m=+0.051121592 container create 28766eaf7fcc0b0078715f46e702e8033efafd4b95ab97f292728459f1cbbc56 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_wilbur, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Jan 26 09:40:01 compute-1 systemd[1]: Started libpod-conmon-28766eaf7fcc0b0078715f46e702e8033efafd4b95ab97f292728459f1cbbc56.scope.
Jan 26 09:40:01 compute-1 podman[73555]: 2026-01-26 09:40:01.6723856 +0000 UTC m=+0.027060185 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:40:01 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:40:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01fd24e34044de2a6ca969d85672345137a3be638b2b6f9aa87d61af3cffd695/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01fd24e34044de2a6ca969d85672345137a3be638b2b6f9aa87d61af3cffd695/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:01 compute-1 podman[73555]: 2026-01-26 09:40:01.922872266 +0000 UTC m=+0.277546881 container init 28766eaf7fcc0b0078715f46e702e8033efafd4b95ab97f292728459f1cbbc56 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_wilbur, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Jan 26 09:40:01 compute-1 podman[73555]: 2026-01-26 09:40:01.930361059 +0000 UTC m=+0.285035614 container start 28766eaf7fcc0b0078715f46e702e8033efafd4b95ab97f292728459f1cbbc56 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_wilbur, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Jan 26 09:40:01 compute-1 podman[73555]: 2026-01-26 09:40:01.956766694 +0000 UTC m=+0.311441259 container attach 28766eaf7fcc0b0078715f46e702e8033efafd4b95ab97f292728459f1cbbc56 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_wilbur, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]: [
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:     {
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:         "available": false,
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:         "being_replaced": false,
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:         "ceph_device_lvm": false,
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:         "device_id": "QEMU_DVD-ROM_QM00001",
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:         "lsm_data": {},
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:         "lvs": [],
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:         "path": "/dev/sr0",
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:         "rejected_reasons": [
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "Has a FileSystem",
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "Insufficient space (<5GB)"
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:         ],
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:         "sys_api": {
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "actuators": null,
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "device_nodes": [
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:                 "sr0"
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             ],
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "devname": "sr0",
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "human_readable_size": "482.00 KB",
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "id_bus": "ata",
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "model": "QEMU DVD-ROM",
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "nr_requests": "2",
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "parent": "/dev/sr0",
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "partitions": {},
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "path": "/dev/sr0",
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "removable": "1",
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "rev": "2.5+",
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "ro": "0",
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "rotational": "1",
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "sas_address": "",
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "sas_device_handle": "",
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "scheduler_mode": "mq-deadline",
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "sectors": 0,
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "sectorsize": "2048",
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "size": 493568.0,
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "support_discard": "2048",
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "type": "disk",
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:             "vendor": "QEMU"
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:         }
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]:     }
Jan 26 09:40:02 compute-1 upbeat_wilbur[73572]: ]
Jan 26 09:40:02 compute-1 systemd[1]: libpod-28766eaf7fcc0b0078715f46e702e8033efafd4b95ab97f292728459f1cbbc56.scope: Deactivated successfully.
Jan 26 09:40:02 compute-1 conmon[73572]: conmon 28766eaf7fcc0b007871 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-28766eaf7fcc0b0078715f46e702e8033efafd4b95ab97f292728459f1cbbc56.scope/container/memory.events
Jan 26 09:40:02 compute-1 podman[73555]: 2026-01-26 09:40:02.750295811 +0000 UTC m=+1.104970366 container died 28766eaf7fcc0b0078715f46e702e8033efafd4b95ab97f292728459f1cbbc56 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_wilbur, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 09:40:02 compute-1 systemd[1]: var-lib-containers-storage-overlay-01fd24e34044de2a6ca969d85672345137a3be638b2b6f9aa87d61af3cffd695-merged.mount: Deactivated successfully.
Jan 26 09:40:03 compute-1 podman[73555]: 2026-01-26 09:40:03.052948566 +0000 UTC m=+1.407623121 container remove 28766eaf7fcc0b0078715f46e702e8033efafd4b95ab97f292728459f1cbbc56 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_wilbur, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 09:40:03 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 09:40:03 compute-1 sudo[73400]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:03 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 09:40:03 compute-1 systemd[1]: libpod-conmon-28766eaf7fcc0b0078715f46e702e8033efafd4b95ab97f292728459f1cbbc56.scope: Deactivated successfully.
Jan 26 09:40:03 compute-1 sudo[74455]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 26 09:40:03 compute-1 sudo[74455]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:03 compute-1 sudo[74455]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:03 compute-1 sudo[74480]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph
Jan 26 09:40:03 compute-1 sudo[74480]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:03 compute-1 sudo[74480]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:03 compute-1 sudo[74505]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new
Jan 26 09:40:03 compute-1 sudo[74505]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:03 compute-1 sudo[74505]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:03 compute-1 sudo[74530]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:40:03 compute-1 sudo[74530]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:03 compute-1 sudo[74530]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:03 compute-1 sudo[74555]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new
Jan 26 09:40:03 compute-1 sudo[74555]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:03 compute-1 sudo[74555]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:03 compute-1 sudo[74603]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new
Jan 26 09:40:03 compute-1 sudo[74603]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:03 compute-1 sudo[74603]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:03 compute-1 sudo[74628]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new
Jan 26 09:40:03 compute-1 sudo[74628]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:03 compute-1 sudo[74628]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:03 compute-1 sudo[74653]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Jan 26 09:40:03 compute-1 sudo[74653]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:03 compute-1 sudo[74653]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:03 compute-1 sudo[74678]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config
Jan 26 09:40:03 compute-1 sudo[74678]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:03 compute-1 sudo[74678]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:03 compute-1 sudo[74703]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config
Jan 26 09:40:03 compute-1 sudo[74703]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:03 compute-1 sudo[74703]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:03 compute-1 sudo[74728]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new
Jan 26 09:40:03 compute-1 sudo[74728]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:03 compute-1 sudo[74728]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:03 compute-1 sudo[74753]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:40:03 compute-1 sudo[74753]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:03 compute-1 sudo[74753]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:03 compute-1 sudo[74778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new
Jan 26 09:40:03 compute-1 sudo[74778]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:03 compute-1 sudo[74778]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:04 compute-1 sudo[74826]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new
Jan 26 09:40:04 compute-1 sudo[74826]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:04 compute-1 sudo[74826]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:04 compute-1 sudo[74851]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new
Jan 26 09:40:04 compute-1 sudo[74851]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:04 compute-1 sudo[74851]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:04 compute-1 sudo[74876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 09:40:04 compute-1 sudo[74876]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:04 compute-1 sudo[74876]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:04 compute-1 sudo[74901]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 26 09:40:04 compute-1 sudo[74901]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:04 compute-1 sudo[74901]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:04 compute-1 sudo[74926]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph
Jan 26 09:40:04 compute-1 sudo[74926]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:04 compute-1 sudo[74926]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:04 compute-1 sudo[74951]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.client.admin.keyring.new
Jan 26 09:40:04 compute-1 sudo[74951]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:04 compute-1 sudo[74951]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:04 compute-1 sudo[74976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:40:04 compute-1 sudo[74976]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:04 compute-1 sudo[74976]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:04 compute-1 sudo[75001]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.client.admin.keyring.new
Jan 26 09:40:04 compute-1 sudo[75001]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:04 compute-1 sudo[75001]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:04 compute-1 sudo[75049]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.client.admin.keyring.new
Jan 26 09:40:04 compute-1 sudo[75049]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:04 compute-1 sudo[75049]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:04 compute-1 sudo[75074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.client.admin.keyring.new
Jan 26 09:40:04 compute-1 sudo[75074]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:04 compute-1 sudo[75074]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:04 compute-1 sudo[75099]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Jan 26 09:40:04 compute-1 sudo[75099]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:04 compute-1 sudo[75099]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:04 compute-1 sudo[75124]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config
Jan 26 09:40:04 compute-1 sudo[75124]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:04 compute-1 sudo[75124]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:04 compute-1 sudo[75149]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config
Jan 26 09:40:04 compute-1 sudo[75149]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:04 compute-1 sudo[75149]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:04 compute-1 sudo[75174]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring.new
Jan 26 09:40:04 compute-1 sudo[75174]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:04 compute-1 sudo[75174]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:04 compute-1 sudo[75199]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:40:04 compute-1 sudo[75199]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:04 compute-1 sudo[75199]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:04 compute-1 sudo[75224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring.new
Jan 26 09:40:04 compute-1 sudo[75224]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:04 compute-1 sudo[75224]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:05 compute-1 sudo[75272]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring.new
Jan 26 09:40:05 compute-1 sudo[75272]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:05 compute-1 sudo[75272]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:05 compute-1 sudo[75297]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring.new
Jan 26 09:40:05 compute-1 sudo[75297]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:05 compute-1 sudo[75297]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:05 compute-1 sudo[75322]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring.new /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 09:40:05 compute-1 sudo[75322]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:05 compute-1 sudo[75322]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:05 compute-1 sudo[75347]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:40:05 compute-1 sudo[75347]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:05 compute-1 sudo[75347]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:05 compute-1 sudo[75372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:40:05 compute-1 sudo[75372]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:05 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 09:40:05 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 09:40:05 compute-1 podman[75436]: 2026-01-26 09:40:05.699775493 +0000 UTC m=+0.035620739 container create d6d787e469710293cdc608f1a0c05c6642a774a506f4548c29a13a0c4cf05ace (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_hermann, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 09:40:05 compute-1 systemd[1]: Started libpod-conmon-d6d787e469710293cdc608f1a0c05c6642a774a506f4548c29a13a0c4cf05ace.scope.
Jan 26 09:40:05 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:40:05 compute-1 podman[75436]: 2026-01-26 09:40:05.769267508 +0000 UTC m=+0.105112784 container init d6d787e469710293cdc608f1a0c05c6642a774a506f4548c29a13a0c4cf05ace (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_hermann, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 09:40:05 compute-1 podman[75436]: 2026-01-26 09:40:05.776166365 +0000 UTC m=+0.112011611 container start d6d787e469710293cdc608f1a0c05c6642a774a506f4548c29a13a0c4cf05ace (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_hermann, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 09:40:05 compute-1 confident_hermann[75453]: 167 167
Jan 26 09:40:05 compute-1 podman[75436]: 2026-01-26 09:40:05.779981873 +0000 UTC m=+0.115827139 container attach d6d787e469710293cdc608f1a0c05c6642a774a506f4548c29a13a0c4cf05ace (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_hermann, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 09:40:05 compute-1 systemd[1]: libpod-d6d787e469710293cdc608f1a0c05c6642a774a506f4548c29a13a0c4cf05ace.scope: Deactivated successfully.
Jan 26 09:40:05 compute-1 podman[75436]: 2026-01-26 09:40:05.780769316 +0000 UTC m=+0.116614582 container died d6d787e469710293cdc608f1a0c05c6642a774a506f4548c29a13a0c4cf05ace (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_hermann, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid)
Jan 26 09:40:05 compute-1 podman[75436]: 2026-01-26 09:40:05.685651089 +0000 UTC m=+0.021496335 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:40:05 compute-1 podman[75436]: 2026-01-26 09:40:05.814012586 +0000 UTC m=+0.149857842 container remove d6d787e469710293cdc608f1a0c05c6642a774a506f4548c29a13a0c4cf05ace (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_hermann, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Jan 26 09:40:05 compute-1 systemd[1]: libpod-conmon-d6d787e469710293cdc608f1a0c05c6642a774a506f4548c29a13a0c4cf05ace.scope: Deactivated successfully.
Jan 26 09:40:05 compute-1 systemd[1]: Reloading.
Jan 26 09:40:05 compute-1 systemd-rc-local-generator[75495]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:40:05 compute-1 systemd-sysv-generator[75498]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:40:06 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 09:40:06 compute-1 systemd[1]: Reloading.
Jan 26 09:40:06 compute-1 systemd-sysv-generator[75538]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:40:06 compute-1 systemd-rc-local-generator[75534]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:40:06 compute-1 systemd[1]: Reached target All Ceph clusters and services.
Jan 26 09:40:06 compute-1 systemd[1]: Reloading.
Jan 26 09:40:06 compute-1 systemd-sysv-generator[75576]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:40:06 compute-1 systemd-rc-local-generator[75572]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:40:06 compute-1 systemd[1]: Reached target Ceph cluster 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:40:06 compute-1 systemd[1]: Reloading.
Jan 26 09:40:06 compute-1 systemd-rc-local-generator[75610]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:40:06 compute-1 systemd-sysv-generator[75613]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:40:06 compute-1 systemd[1]: Reloading.
Jan 26 09:40:07 compute-1 systemd-rc-local-generator[75650]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:40:07 compute-1 systemd-sysv-generator[75653]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:40:07 compute-1 systemd[1]: Created slice Slice /system/ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:40:07 compute-1 systemd[1]: Reached target System Time Set.
Jan 26 09:40:07 compute-1 systemd[1]: Reached target System Time Synchronized.
Jan 26 09:40:07 compute-1 systemd[1]: Starting Ceph crash.compute-1 for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 09:40:07 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 09:40:07 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 09:40:07 compute-1 podman[75706]: 2026-01-26 09:40:07.471240494 +0000 UTC m=+0.052963234 container create 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 09:40:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1865e4ebf94f5b6e666b769306a130094ae8b4bb66a9e0288e50127b753ab7b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1865e4ebf94f5b6e666b769306a130094ae8b4bb66a9e0288e50127b753ab7b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1865e4ebf94f5b6e666b769306a130094ae8b4bb66a9e0288e50127b753ab7b/merged/etc/ceph/ceph.client.crash.compute-1.keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:07 compute-1 podman[75706]: 2026-01-26 09:40:07.441019941 +0000 UTC m=+0.022742701 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:40:07 compute-1 podman[75706]: 2026-01-26 09:40:07.541934993 +0000 UTC m=+0.123657753 container init 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Jan 26 09:40:07 compute-1 podman[75706]: 2026-01-26 09:40:07.548496902 +0000 UTC m=+0.130219642 container start 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 09:40:07 compute-1 bash[75706]: 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40
Jan 26 09:40:07 compute-1 systemd[1]: Started Ceph crash.compute-1 for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:40:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1[75722]: INFO:ceph-crash:pinging cluster to exercise our key
Jan 26 09:40:07 compute-1 sudo[75372]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1[75722]: 2026-01-26T09:40:07.707+0000 7f4fd062c640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 26 09:40:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1[75722]: 2026-01-26T09:40:07.707+0000 7f4fd062c640 -1 AuthRegistry(0x7f4fc80698f0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 26 09:40:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1[75722]: 2026-01-26T09:40:07.708+0000 7f4fd062c640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 26 09:40:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1[75722]: 2026-01-26T09:40:07.708+0000 7f4fd062c640 -1 AuthRegistry(0x7f4fd062aff0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 26 09:40:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1[75722]: 2026-01-26T09:40:07.711+0000 7f4fce3a1640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 26 09:40:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1[75722]: 2026-01-26T09:40:07.711+0000 7f4fd062c640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Jan 26 09:40:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1[75722]: [errno 13] RADOS permission denied (error connecting to the cluster)
Jan 26 09:40:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1[75722]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Jan 26 09:40:08 compute-1 sudo[75741]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:40:08 compute-1 sudo[75741]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:08 compute-1 sudo[75741]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:08 compute-1 sudo[75766]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30 --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 --yes --no-systemd
Jan 26 09:40:08 compute-1 sudo[75766]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:08 compute-1 podman[75832]: 2026-01-26 09:40:08.613061471 +0000 UTC m=+0.045089579 container create e0d6b2bed70b5147af94a114cafbae77823430339c2ec1a07afc187f657b2757 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=practical_johnson, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Jan 26 09:40:08 compute-1 systemd[1]: Started libpod-conmon-e0d6b2bed70b5147af94a114cafbae77823430339c2ec1a07afc187f657b2757.scope.
Jan 26 09:40:08 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:40:08 compute-1 podman[75832]: 2026-01-26 09:40:08.678435888 +0000 UTC m=+0.110464026 container init e0d6b2bed70b5147af94a114cafbae77823430339c2ec1a07afc187f657b2757 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=practical_johnson, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 09:40:08 compute-1 podman[75832]: 2026-01-26 09:40:08.685553032 +0000 UTC m=+0.117581140 container start e0d6b2bed70b5147af94a114cafbae77823430339c2ec1a07afc187f657b2757 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=practical_johnson, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 09:40:08 compute-1 podman[75832]: 2026-01-26 09:40:08.592554954 +0000 UTC m=+0.024583112 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:40:08 compute-1 podman[75832]: 2026-01-26 09:40:08.689522105 +0000 UTC m=+0.121550213 container attach e0d6b2bed70b5147af94a114cafbae77823430339c2ec1a07afc187f657b2757 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=practical_johnson, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:40:08 compute-1 practical_johnson[75848]: 167 167
Jan 26 09:40:08 compute-1 systemd[1]: libpod-e0d6b2bed70b5147af94a114cafbae77823430339c2ec1a07afc187f657b2757.scope: Deactivated successfully.
Jan 26 09:40:08 compute-1 podman[75832]: 2026-01-26 09:40:08.690874143 +0000 UTC m=+0.122902261 container died e0d6b2bed70b5147af94a114cafbae77823430339c2ec1a07afc187f657b2757 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=practical_johnson, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:40:08 compute-1 systemd[1]: var-lib-containers-storage-overlay-e4a740afea6bd451727603c5e72b1d9fd896816c7a0fd62f19a0295c57db0f6a-merged.mount: Deactivated successfully.
Jan 26 09:40:08 compute-1 podman[75832]: 2026-01-26 09:40:08.724410331 +0000 UTC m=+0.156438439 container remove e0d6b2bed70b5147af94a114cafbae77823430339c2ec1a07afc187f657b2757 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=practical_johnson, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Jan 26 09:40:08 compute-1 systemd[1]: libpod-conmon-e0d6b2bed70b5147af94a114cafbae77823430339c2ec1a07afc187f657b2757.scope: Deactivated successfully.
Jan 26 09:40:08 compute-1 podman[75874]: 2026-01-26 09:40:08.877821393 +0000 UTC m=+0.046131208 container create 387fe622ba0789b52550a7f5fe9f0ef5dd6a85cb3ee600d6c2f1bd86ac01db23 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_swanson, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, ceph=True)
Jan 26 09:40:08 compute-1 systemd[1]: Started libpod-conmon-387fe622ba0789b52550a7f5fe9f0ef5dd6a85cb3ee600d6c2f1bd86ac01db23.scope.
Jan 26 09:40:08 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:40:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/144e48bf0894d5fd88832d93b6b5fb32ee1de11ba8269e368d62938cc3842a98/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/144e48bf0894d5fd88832d93b6b5fb32ee1de11ba8269e368d62938cc3842a98/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/144e48bf0894d5fd88832d93b6b5fb32ee1de11ba8269e368d62938cc3842a98/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/144e48bf0894d5fd88832d93b6b5fb32ee1de11ba8269e368d62938cc3842a98/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/144e48bf0894d5fd88832d93b6b5fb32ee1de11ba8269e368d62938cc3842a98/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:08 compute-1 podman[75874]: 2026-01-26 09:40:08.858171682 +0000 UTC m=+0.026481517 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:40:08 compute-1 podman[75874]: 2026-01-26 09:40:08.959858436 +0000 UTC m=+0.128168261 container init 387fe622ba0789b52550a7f5fe9f0ef5dd6a85cb3ee600d6c2f1bd86ac01db23 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_swanson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 09:40:08 compute-1 podman[75874]: 2026-01-26 09:40:08.970799419 +0000 UTC m=+0.139109234 container start 387fe622ba0789b52550a7f5fe9f0ef5dd6a85cb3ee600d6c2f1bd86ac01db23 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_swanson, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Jan 26 09:40:08 compute-1 podman[75874]: 2026-01-26 09:40:08.974432603 +0000 UTC m=+0.142742418 container attach 387fe622ba0789b52550a7f5fe9f0ef5dd6a85cb3ee600d6c2f1bd86ac01db23 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_swanson, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 09:40:08 compute-1 sshd-session[75725]: Connection closed by authenticating user root 178.62.249.31 port 44122 [preauth]
Jan 26 09:40:09 compute-1 heuristic_swanson[75890]: --> passed data devices: 0 physical, 1 LVM
Jan 26 09:40:09 compute-1 heuristic_swanson[75890]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 09:40:09 compute-1 heuristic_swanson[75890]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 09:40:09 compute-1 heuristic_swanson[75890]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 8ed5e8fe-4547-4eab-be95-05fd5f9f3f95
Jan 26 09:40:09 compute-1 heuristic_swanson[75890]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-1
Jan 26 09:40:09 compute-1 heuristic_swanson[75890]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Jan 26 09:40:09 compute-1 heuristic_swanson[75890]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 26 09:40:09 compute-1 heuristic_swanson[75890]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:09 compute-1 heuristic_swanson[75890]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-1/activate.monmap
Jan 26 09:40:09 compute-1 lvm[75953]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 09:40:09 compute-1 lvm[75953]: VG ceph_vg0 finished
Jan 26 09:40:10 compute-1 heuristic_swanson[75890]:  stderr: got monmap epoch 1
Jan 26 09:40:10 compute-1 heuristic_swanson[75890]: --> Creating keyring file for osd.1
Jan 26 09:40:10 compute-1 heuristic_swanson[75890]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/keyring
Jan 26 09:40:10 compute-1 heuristic_swanson[75890]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/
Jan 26 09:40:10 compute-1 heuristic_swanson[75890]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 1 --monmap /var/lib/ceph/osd/ceph-1/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-1/ --osd-uuid 8ed5e8fe-4547-4eab-be95-05fd5f9f3f95 --setuser ceph --setgroup ceph
Jan 26 09:40:13 compute-1 heuristic_swanson[75890]:  stderr: 2026-01-26T09:40:10.530+0000 7ffb5d790740 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) No valid bdev label found
Jan 26 09:40:13 compute-1 heuristic_swanson[75890]:  stderr: 2026-01-26T09:40:10.791+0000 7ffb5d790740 -1 bluestore(/var/lib/ceph/osd/ceph-1/) _read_fsid unparsable uuid
Jan 26 09:40:13 compute-1 heuristic_swanson[75890]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Jan 26 09:40:13 compute-1 heuristic_swanson[75890]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 26 09:40:13 compute-1 heuristic_swanson[75890]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
Jan 26 09:40:13 compute-1 heuristic_swanson[75890]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:13 compute-1 heuristic_swanson[75890]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:13 compute-1 heuristic_swanson[75890]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 26 09:40:13 compute-1 heuristic_swanson[75890]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 26 09:40:13 compute-1 heuristic_swanson[75890]: --> ceph-volume lvm activate successful for osd ID: 1
Jan 26 09:40:13 compute-1 heuristic_swanson[75890]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Jan 26 09:40:13 compute-1 systemd[1]: libpod-387fe622ba0789b52550a7f5fe9f0ef5dd6a85cb3ee600d6c2f1bd86ac01db23.scope: Deactivated successfully.
Jan 26 09:40:13 compute-1 systemd[1]: libpod-387fe622ba0789b52550a7f5fe9f0ef5dd6a85cb3ee600d6c2f1bd86ac01db23.scope: Consumed 2.097s CPU time.
Jan 26 09:40:13 compute-1 podman[75874]: 2026-01-26 09:40:13.508718545 +0000 UTC m=+4.677028360 container died 387fe622ba0789b52550a7f5fe9f0ef5dd6a85cb3ee600d6c2f1bd86ac01db23 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_swanson, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default)
Jan 26 09:40:13 compute-1 systemd[1]: var-lib-containers-storage-overlay-144e48bf0894d5fd88832d93b6b5fb32ee1de11ba8269e368d62938cc3842a98-merged.mount: Deactivated successfully.
Jan 26 09:40:13 compute-1 podman[75874]: 2026-01-26 09:40:13.557619942 +0000 UTC m=+4.725929757 container remove 387fe622ba0789b52550a7f5fe9f0ef5dd6a85cb3ee600d6c2f1bd86ac01db23 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_swanson, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Jan 26 09:40:13 compute-1 systemd[1]: libpod-conmon-387fe622ba0789b52550a7f5fe9f0ef5dd6a85cb3ee600d6c2f1bd86ac01db23.scope: Deactivated successfully.
Jan 26 09:40:13 compute-1 sudo[75766]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:13 compute-1 sudo[76879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:40:13 compute-1 sudo[76879]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:13 compute-1 sudo[76879]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:13 compute-1 sudo[76904]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30 -- lvm list --format json
Jan 26 09:40:13 compute-1 sudo[76904]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:14 compute-1 podman[76971]: 2026-01-26 09:40:14.204546322 +0000 UTC m=+0.041633931 container create 5ea842917222bb1e3fe2ce292e31c4864b2fb48be6833e5fdc8098a877367f98 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nice_wilson, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:40:14 compute-1 systemd[1]: Started libpod-conmon-5ea842917222bb1e3fe2ce292e31c4864b2fb48be6833e5fdc8098a877367f98.scope.
Jan 26 09:40:14 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:40:14 compute-1 podman[76971]: 2026-01-26 09:40:14.186080254 +0000 UTC m=+0.023167893 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:40:14 compute-1 podman[76971]: 2026-01-26 09:40:14.295808578 +0000 UTC m=+0.132896197 container init 5ea842917222bb1e3fe2ce292e31c4864b2fb48be6833e5fdc8098a877367f98 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nice_wilson, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:40:14 compute-1 podman[76971]: 2026-01-26 09:40:14.305857945 +0000 UTC m=+0.142945544 container start 5ea842917222bb1e3fe2ce292e31c4864b2fb48be6833e5fdc8098a877367f98 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nice_wilson, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 09:40:14 compute-1 podman[76971]: 2026-01-26 09:40:14.309107108 +0000 UTC m=+0.146194737 container attach 5ea842917222bb1e3fe2ce292e31c4864b2fb48be6833e5fdc8098a877367f98 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nice_wilson, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Jan 26 09:40:14 compute-1 nice_wilson[76987]: 167 167
Jan 26 09:40:14 compute-1 systemd[1]: libpod-5ea842917222bb1e3fe2ce292e31c4864b2fb48be6833e5fdc8098a877367f98.scope: Deactivated successfully.
Jan 26 09:40:14 compute-1 podman[76971]: 2026-01-26 09:40:14.31197579 +0000 UTC m=+0.149063389 container died 5ea842917222bb1e3fe2ce292e31c4864b2fb48be6833e5fdc8098a877367f98 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nice_wilson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Jan 26 09:40:14 compute-1 systemd[1]: var-lib-containers-storage-overlay-2d001f63c1c224f381f70e1f0a04c610b99716937520b8ee5f518b533cee79c5-merged.mount: Deactivated successfully.
Jan 26 09:40:14 compute-1 podman[76971]: 2026-01-26 09:40:14.345204359 +0000 UTC m=+0.182291958 container remove 5ea842917222bb1e3fe2ce292e31c4864b2fb48be6833e5fdc8098a877367f98 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nice_wilson, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 09:40:14 compute-1 systemd[1]: libpod-conmon-5ea842917222bb1e3fe2ce292e31c4864b2fb48be6833e5fdc8098a877367f98.scope: Deactivated successfully.
Jan 26 09:40:14 compute-1 podman[77012]: 2026-01-26 09:40:14.534912669 +0000 UTC m=+0.050059661 container create 45ac69645ecb663e0f1e522095327c43adcaa3e74c1157414913537740df98df (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_sanderson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Jan 26 09:40:14 compute-1 systemd[1]: Started libpod-conmon-45ac69645ecb663e0f1e522095327c43adcaa3e74c1157414913537740df98df.scope.
Jan 26 09:40:14 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:40:14 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f713212828322b28c4ee7d8dcc6948ead68b259fcac0e41b7ae9f36997ffd63/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:14 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f713212828322b28c4ee7d8dcc6948ead68b259fcac0e41b7ae9f36997ffd63/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:14 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f713212828322b28c4ee7d8dcc6948ead68b259fcac0e41b7ae9f36997ffd63/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:14 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f713212828322b28c4ee7d8dcc6948ead68b259fcac0e41b7ae9f36997ffd63/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:14 compute-1 podman[77012]: 2026-01-26 09:40:14.606490923 +0000 UTC m=+0.121637935 container init 45ac69645ecb663e0f1e522095327c43adcaa3e74c1157414913537740df98df (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_sanderson, org.label-schema.build-date=20250325, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Jan 26 09:40:14 compute-1 podman[77012]: 2026-01-26 09:40:14.514472084 +0000 UTC m=+0.029619136 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:40:14 compute-1 podman[77012]: 2026-01-26 09:40:14.614313986 +0000 UTC m=+0.129460978 container start 45ac69645ecb663e0f1e522095327c43adcaa3e74c1157414913537740df98df (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_sanderson, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 09:40:14 compute-1 podman[77012]: 2026-01-26 09:40:14.620292357 +0000 UTC m=+0.135439499 container attach 45ac69645ecb663e0f1e522095327c43adcaa3e74c1157414913537740df98df (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_sanderson, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]: {
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:     "1": [
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:         {
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:             "devices": [
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:                 "/dev/loop3"
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:             ],
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:             "lv_name": "ceph_lv0",
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:             "lv_size": "21470642176",
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PaSr8h-YOLM-efKX-5RlJ-3Pqs-GEP9-6iJAZw,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=1a70b85d-e3fd-5814-8a6a-37ea00fcae30,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=8ed5e8fe-4547-4eab-be95-05fd5f9f3f95,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:             "lv_uuid": "PaSr8h-YOLM-efKX-5RlJ-3Pqs-GEP9-6iJAZw",
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:             "name": "ceph_lv0",
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:             "path": "/dev/ceph_vg0/ceph_lv0",
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:             "tags": {
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:                 "ceph.block_uuid": "PaSr8h-YOLM-efKX-5RlJ-3Pqs-GEP9-6iJAZw",
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:                 "ceph.cephx_lockbox_secret": "",
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:                 "ceph.cluster_fsid": "1a70b85d-e3fd-5814-8a6a-37ea00fcae30",
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:                 "ceph.cluster_name": "ceph",
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:                 "ceph.crush_device_class": "",
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:                 "ceph.encrypted": "0",
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:                 "ceph.osd_fsid": "8ed5e8fe-4547-4eab-be95-05fd5f9f3f95",
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:                 "ceph.osd_id": "1",
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:                 "ceph.osdspec_affinity": "default_drive_group",
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:                 "ceph.type": "block",
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:                 "ceph.vdo": "0",
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:                 "ceph.with_tpm": "0"
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:             },
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:             "type": "block",
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:             "vg_name": "ceph_vg0"
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:         }
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]:     ]
Jan 26 09:40:14 compute-1 infallible_sanderson[77029]: }
Jan 26 09:40:14 compute-1 systemd[1]: libpod-45ac69645ecb663e0f1e522095327c43adcaa3e74c1157414913537740df98df.scope: Deactivated successfully.
Jan 26 09:40:14 compute-1 podman[77012]: 2026-01-26 09:40:14.932815155 +0000 UTC m=+0.447962147 container died 45ac69645ecb663e0f1e522095327c43adcaa3e74c1157414913537740df98df (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_sanderson, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Jan 26 09:40:14 compute-1 systemd[1]: var-lib-containers-storage-overlay-2f713212828322b28c4ee7d8dcc6948ead68b259fcac0e41b7ae9f36997ffd63-merged.mount: Deactivated successfully.
Jan 26 09:40:14 compute-1 podman[77012]: 2026-01-26 09:40:14.975794992 +0000 UTC m=+0.490941974 container remove 45ac69645ecb663e0f1e522095327c43adcaa3e74c1157414913537740df98df (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_sanderson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default)
Jan 26 09:40:14 compute-1 systemd[1]: libpod-conmon-45ac69645ecb663e0f1e522095327c43adcaa3e74c1157414913537740df98df.scope: Deactivated successfully.
Jan 26 09:40:15 compute-1 sudo[76904]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:15 compute-1 sudo[77050]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:40:15 compute-1 sudo[77050]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:15 compute-1 sudo[77050]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:15 compute-1 sudo[77075]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:40:15 compute-1 sudo[77075]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:15 compute-1 podman[77142]: 2026-01-26 09:40:15.547318088 +0000 UTC m=+0.050369350 container create 2a01c745f4471b87d52aae05b93a441dd06e70e5b0bbe3f0ad9984f6e5884df1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_bhabha, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 09:40:15 compute-1 systemd[1]: Started libpod-conmon-2a01c745f4471b87d52aae05b93a441dd06e70e5b0bbe3f0ad9984f6e5884df1.scope.
Jan 26 09:40:15 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:40:15 compute-1 podman[77142]: 2026-01-26 09:40:15.529662314 +0000 UTC m=+0.032713596 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:40:15 compute-1 podman[77142]: 2026-01-26 09:40:15.626905441 +0000 UTC m=+0.129956743 container init 2a01c745f4471b87d52aae05b93a441dd06e70e5b0bbe3f0ad9984f6e5884df1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_bhabha, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Jan 26 09:40:15 compute-1 podman[77142]: 2026-01-26 09:40:15.635324631 +0000 UTC m=+0.138375893 container start 2a01c745f4471b87d52aae05b93a441dd06e70e5b0bbe3f0ad9984f6e5884df1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_bhabha, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 09:40:15 compute-1 podman[77142]: 2026-01-26 09:40:15.638836982 +0000 UTC m=+0.141888244 container attach 2a01c745f4471b87d52aae05b93a441dd06e70e5b0bbe3f0ad9984f6e5884df1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_bhabha, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 09:40:15 compute-1 modest_bhabha[77159]: 167 167
Jan 26 09:40:15 compute-1 systemd[1]: libpod-2a01c745f4471b87d52aae05b93a441dd06e70e5b0bbe3f0ad9984f6e5884df1.scope: Deactivated successfully.
Jan 26 09:40:15 compute-1 podman[77142]: 2026-01-26 09:40:15.644003659 +0000 UTC m=+0.147054941 container died 2a01c745f4471b87d52aae05b93a441dd06e70e5b0bbe3f0ad9984f6e5884df1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_bhabha, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 09:40:15 compute-1 systemd[1]: var-lib-containers-storage-overlay-2fdcb90caf1d2ff9950e0425d3100e8d83b71cbb46052cad049b7a4572a60abd-merged.mount: Deactivated successfully.
Jan 26 09:40:15 compute-1 podman[77142]: 2026-01-26 09:40:15.676304432 +0000 UTC m=+0.179355694 container remove 2a01c745f4471b87d52aae05b93a441dd06e70e5b0bbe3f0ad9984f6e5884df1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_bhabha, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 09:40:15 compute-1 systemd[1]: libpod-conmon-2a01c745f4471b87d52aae05b93a441dd06e70e5b0bbe3f0ad9984f6e5884df1.scope: Deactivated successfully.
Jan 26 09:40:15 compute-1 podman[77190]: 2026-01-26 09:40:15.922810444 +0000 UTC m=+0.044631776 container create c54918519b985f75d59614ad7b4bd790b1e0c32713154a8bac75d4e489d274ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate-test, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:40:15 compute-1 systemd[1]: Started libpod-conmon-c54918519b985f75d59614ad7b4bd790b1e0c32713154a8bac75d4e489d274ff.scope.
Jan 26 09:40:15 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:40:15 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8d364819c7b47ea952b3f1eed120910f7cb7378b0367df5916e11fbe3963c3f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:15 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8d364819c7b47ea952b3f1eed120910f7cb7378b0367df5916e11fbe3963c3f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:15 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8d364819c7b47ea952b3f1eed120910f7cb7378b0367df5916e11fbe3963c3f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8d364819c7b47ea952b3f1eed120910f7cb7378b0367df5916e11fbe3963c3f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8d364819c7b47ea952b3f1eed120910f7cb7378b0367df5916e11fbe3963c3f/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:16 compute-1 podman[77190]: 2026-01-26 09:40:15.903201713 +0000 UTC m=+0.025023065 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:40:16 compute-1 podman[77190]: 2026-01-26 09:40:16.011869408 +0000 UTC m=+0.133690760 container init c54918519b985f75d59614ad7b4bd790b1e0c32713154a8bac75d4e489d274ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate-test, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 09:40:16 compute-1 podman[77190]: 2026-01-26 09:40:16.018387374 +0000 UTC m=+0.140208706 container start c54918519b985f75d59614ad7b4bd790b1e0c32713154a8bac75d4e489d274ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate-test, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 09:40:16 compute-1 podman[77190]: 2026-01-26 09:40:16.021577435 +0000 UTC m=+0.143398767 container attach c54918519b985f75d59614ad7b4bd790b1e0c32713154a8bac75d4e489d274ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate-test, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Jan 26 09:40:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate-test[77206]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Jan 26 09:40:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate-test[77206]:                             [--no-systemd] [--no-tmpfs]
Jan 26 09:40:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate-test[77206]: ceph-volume activate: error: unrecognized arguments: --bad-option
Jan 26 09:40:16 compute-1 systemd[1]: libpod-c54918519b985f75d59614ad7b4bd790b1e0c32713154a8bac75d4e489d274ff.scope: Deactivated successfully.
Jan 26 09:40:16 compute-1 podman[77190]: 2026-01-26 09:40:16.230679338 +0000 UTC m=+0.352500670 container died c54918519b985f75d59614ad7b4bd790b1e0c32713154a8bac75d4e489d274ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate-test, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 09:40:16 compute-1 systemd[1]: var-lib-containers-storage-overlay-a8d364819c7b47ea952b3f1eed120910f7cb7378b0367df5916e11fbe3963c3f-merged.mount: Deactivated successfully.
Jan 26 09:40:16 compute-1 podman[77190]: 2026-01-26 09:40:16.271929287 +0000 UTC m=+0.393750619 container remove c54918519b985f75d59614ad7b4bd790b1e0c32713154a8bac75d4e489d274ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate-test, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True)
Jan 26 09:40:16 compute-1 systemd[1]: libpod-conmon-c54918519b985f75d59614ad7b4bd790b1e0c32713154a8bac75d4e489d274ff.scope: Deactivated successfully.
Jan 26 09:40:16 compute-1 systemd[1]: Reloading.
Jan 26 09:40:16 compute-1 systemd-sysv-generator[77274]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:40:16 compute-1 systemd-rc-local-generator[77269]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:40:16 compute-1 systemd[1]: Reloading.
Jan 26 09:40:16 compute-1 systemd-rc-local-generator[77309]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:40:16 compute-1 systemd-sysv-generator[77313]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:40:17 compute-1 systemd[1]: Starting Ceph osd.1 for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 09:40:17 compute-1 podman[77367]: 2026-01-26 09:40:17.300786536 +0000 UTC m=+0.042329600 container create 0ac1e5649ab9133aa599199960d1ac5c39c443429ae70878c69dac54c116a593 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 09:40:17 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:40:17 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27b495c530188627c4776c9822828e7d0693794e2ce4752a123cbdcadc59ecd2/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:17 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27b495c530188627c4776c9822828e7d0693794e2ce4752a123cbdcadc59ecd2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:17 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27b495c530188627c4776c9822828e7d0693794e2ce4752a123cbdcadc59ecd2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:17 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27b495c530188627c4776c9822828e7d0693794e2ce4752a123cbdcadc59ecd2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:17 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27b495c530188627c4776c9822828e7d0693794e2ce4752a123cbdcadc59ecd2/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:17 compute-1 podman[77367]: 2026-01-26 09:40:17.27958834 +0000 UTC m=+0.021131424 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:40:17 compute-1 podman[77367]: 2026-01-26 09:40:17.389245472 +0000 UTC m=+0.130788556 container init 0ac1e5649ab9133aa599199960d1ac5c39c443429ae70878c69dac54c116a593 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 09:40:17 compute-1 podman[77367]: 2026-01-26 09:40:17.394960856 +0000 UTC m=+0.136503920 container start 0ac1e5649ab9133aa599199960d1ac5c39c443429ae70878c69dac54c116a593 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Jan 26 09:40:17 compute-1 podman[77367]: 2026-01-26 09:40:17.398555949 +0000 UTC m=+0.140099013 container attach 0ac1e5649ab9133aa599199960d1ac5c39c443429ae70878c69dac54c116a593 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate, CEPH_REF=squid, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 09:40:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 09:40:17 compute-1 bash[77367]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 09:40:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 09:40:17 compute-1 bash[77367]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 09:40:18 compute-1 lvm[77463]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 09:40:18 compute-1 lvm[77463]: VG ceph_vg0 finished
Jan 26 09:40:18 compute-1 lvm[77467]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 09:40:18 compute-1 lvm[77467]: VG ceph_vg0 finished
Jan 26 09:40:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: --> Failed to activate via raw: did not find any matching OSD to activate
Jan 26 09:40:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 09:40:18 compute-1 bash[77367]: --> Failed to activate via raw: did not find any matching OSD to activate
Jan 26 09:40:18 compute-1 bash[77367]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 09:40:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 09:40:18 compute-1 bash[77367]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 09:40:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 26 09:40:18 compute-1 bash[77367]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 26 09:40:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
Jan 26 09:40:18 compute-1 bash[77367]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
Jan 26 09:40:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:18 compute-1 bash[77367]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:18 compute-1 bash[77367]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 26 09:40:18 compute-1 bash[77367]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 26 09:40:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 26 09:40:18 compute-1 bash[77367]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 26 09:40:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: --> ceph-volume lvm activate successful for osd ID: 1
Jan 26 09:40:18 compute-1 bash[77367]: --> ceph-volume lvm activate successful for osd ID: 1
Jan 26 09:40:18 compute-1 systemd[1]: libpod-0ac1e5649ab9133aa599199960d1ac5c39c443429ae70878c69dac54c116a593.scope: Deactivated successfully.
Jan 26 09:40:18 compute-1 systemd[1]: libpod-0ac1e5649ab9133aa599199960d1ac5c39c443429ae70878c69dac54c116a593.scope: Consumed 1.480s CPU time.
Jan 26 09:40:18 compute-1 podman[77367]: 2026-01-26 09:40:18.680283311 +0000 UTC m=+1.421826385 container died 0ac1e5649ab9133aa599199960d1ac5c39c443429ae70878c69dac54c116a593 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 09:40:18 compute-1 systemd[1]: var-lib-containers-storage-overlay-27b495c530188627c4776c9822828e7d0693794e2ce4752a123cbdcadc59ecd2-merged.mount: Deactivated successfully.
Jan 26 09:40:18 compute-1 podman[77367]: 2026-01-26 09:40:18.726205083 +0000 UTC m=+1.467748147 container remove 0ac1e5649ab9133aa599199960d1ac5c39c443429ae70878c69dac54c116a593 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 09:40:18 compute-1 podman[77613]: 2026-01-26 09:40:18.93825711 +0000 UTC m=+0.041929499 container create ba4e5e4834ef65c29e4703b2ab0f0e5713b41e16b2137b2982ada209763b448d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 09:40:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/171b640df2369381bfad10f568be89f728f5dc5625b3469beef0a366f0bb7948/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/171b640df2369381bfad10f568be89f728f5dc5625b3469beef0a366f0bb7948/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/171b640df2369381bfad10f568be89f728f5dc5625b3469beef0a366f0bb7948/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/171b640df2369381bfad10f568be89f728f5dc5625b3469beef0a366f0bb7948/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/171b640df2369381bfad10f568be89f728f5dc5625b3469beef0a366f0bb7948/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:19 compute-1 podman[77613]: 2026-01-26 09:40:18.999628443 +0000 UTC m=+0.103300862 container init ba4e5e4834ef65c29e4703b2ab0f0e5713b41e16b2137b2982ada209763b448d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Jan 26 09:40:19 compute-1 podman[77613]: 2026-01-26 09:40:19.005660295 +0000 UTC m=+0.109332694 container start ba4e5e4834ef65c29e4703b2ab0f0e5713b41e16b2137b2982ada209763b448d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Jan 26 09:40:19 compute-1 bash[77613]: ba4e5e4834ef65c29e4703b2ab0f0e5713b41e16b2137b2982ada209763b448d
Jan 26 09:40:19 compute-1 podman[77613]: 2026-01-26 09:40:18.921590904 +0000 UTC m=+0.025263333 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:40:19 compute-1 systemd[1]: Started Ceph osd.1 for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:40:19 compute-1 ceph-osd[77632]: set uid:gid to 167:167 (ceph:ceph)
Jan 26 09:40:19 compute-1 ceph-osd[77632]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-osd, pid 2
Jan 26 09:40:19 compute-1 ceph-osd[77632]: pidfile_write: ignore empty --pid-file
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 09:40:19 compute-1 sudo[77075]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:19 compute-1 sudo[77644]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:40:19 compute-1 sudo[77644]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:19 compute-1 sudo[77644]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:19 compute-1 sudo[77669]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30 -- raw list --format json
Jan 26 09:40:19 compute-1 sudo[77669]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1c00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1c00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 20 GiB
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1c00 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 09:40:19 compute-1 podman[77742]: 2026-01-26 09:40:19.639742698 +0000 UTC m=+0.022492533 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:40:19 compute-1 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 09:40:20 compute-1 podman[77742]: 2026-01-26 09:40:20.051102179 +0000 UTC m=+0.433851994 container create 2833edf786340ea0d93d286a376f4d5550802d5be01b3a14d969f6af894ac500 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_benz, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:40:20 compute-1 systemd[1]: Started libpod-conmon-2833edf786340ea0d93d286a376f4d5550802d5be01b3a14d969f6af894ac500.scope.
Jan 26 09:40:20 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:40:20 compute-1 podman[77742]: 2026-01-26 09:40:20.149593862 +0000 UTC m=+0.532343687 container init 2833edf786340ea0d93d286a376f4d5550802d5be01b3a14d969f6af894ac500 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_benz, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 09:40:20 compute-1 podman[77742]: 2026-01-26 09:40:20.160911075 +0000 UTC m=+0.543660890 container start 2833edf786340ea0d93d286a376f4d5550802d5be01b3a14d969f6af894ac500 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_benz, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 09:40:20 compute-1 podman[77742]: 2026-01-26 09:40:20.165194837 +0000 UTC m=+0.547944672 container attach 2833edf786340ea0d93d286a376f4d5550802d5be01b3a14d969f6af894ac500 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_benz, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 09:40:20 compute-1 pensive_benz[77763]: 167 167
Jan 26 09:40:20 compute-1 systemd[1]: libpod-2833edf786340ea0d93d286a376f4d5550802d5be01b3a14d969f6af894ac500.scope: Deactivated successfully.
Jan 26 09:40:20 compute-1 podman[77742]: 2026-01-26 09:40:20.169754348 +0000 UTC m=+0.552504193 container died 2833edf786340ea0d93d286a376f4d5550802d5be01b3a14d969f6af894ac500 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_benz, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 09:40:20 compute-1 ceph-osd[77632]: starting osd.1 osd_data /var/lib/ceph/osd/ceph-1 /var/lib/ceph/osd/ceph-1/journal
Jan 26 09:40:20 compute-1 systemd[1]: var-lib-containers-storage-overlay-412fa71ff8050b5792a2e40fb88ad19918a3b60985c8c2b5730d6f8b88b3360b-merged.mount: Deactivated successfully.
Jan 26 09:40:20 compute-1 ceph-osd[77632]: load: jerasure load: lrc 
Jan 26 09:40:20 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:20 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 09:40:20 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 09:40:20 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 09:40:20 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 26 09:40:20 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 09:40:20 compute-1 podman[77742]: 2026-01-26 09:40:20.218718607 +0000 UTC m=+0.601468422 container remove 2833edf786340ea0d93d286a376f4d5550802d5be01b3a14d969f6af894ac500 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_benz, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 09:40:20 compute-1 systemd[1]: libpod-conmon-2833edf786340ea0d93d286a376f4d5550802d5be01b3a14d969f6af894ac500.scope: Deactivated successfully.
Jan 26 09:40:20 compute-1 podman[77791]: 2026-01-26 09:40:20.399576353 +0000 UTC m=+0.060000805 container create 850cccbb0d31b69b90968858daecf9dc51ce737d873ebc97405868154a952b7b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_lamport, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 09:40:20 compute-1 systemd[1]: Started libpod-conmon-850cccbb0d31b69b90968858daecf9dc51ce737d873ebc97405868154a952b7b.scope.
Jan 26 09:40:20 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:40:20 compute-1 podman[77791]: 2026-01-26 09:40:20.379804988 +0000 UTC m=+0.040229440 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:40:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6abc23e562a823c5ab5fd0e92fc563387bc662de40b5132122a694a67974f4c5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:20 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:20 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 09:40:20 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 09:40:20 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 09:40:20 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 26 09:40:20 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 09:40:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6abc23e562a823c5ab5fd0e92fc563387bc662de40b5132122a694a67974f4c5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6abc23e562a823c5ab5fd0e92fc563387bc662de40b5132122a694a67974f4c5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6abc23e562a823c5ab5fd0e92fc563387bc662de40b5132122a694a67974f4c5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:20 compute-1 podman[77791]: 2026-01-26 09:40:20.495939395 +0000 UTC m=+0.156363847 container init 850cccbb0d31b69b90968858daecf9dc51ce737d873ebc97405868154a952b7b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_lamport, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 09:40:20 compute-1 podman[77791]: 2026-01-26 09:40:20.502921115 +0000 UTC m=+0.163345547 container start 850cccbb0d31b69b90968858daecf9dc51ce737d873ebc97405868154a952b7b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_lamport, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Jan 26 09:40:20 compute-1 podman[77791]: 2026-01-26 09:40:20.506363443 +0000 UTC m=+0.166787895 container attach 850cccbb0d31b69b90968858daecf9dc51ce737d873ebc97405868154a952b7b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_lamport, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid)
Jan 26 09:40:20 compute-1 ceph-osd[77632]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Jan 26 09:40:20 compute-1 ceph-osd[77632]: osd.1:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Jan 26 09:40:20 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:20 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 09:40:20 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 09:40:20 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 09:40:20 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 09:40:21 compute-1 lvm[77897]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 09:40:21 compute-1 lvm[77897]: VG ceph_vg0 finished
Jan 26 09:40:21 compute-1 ecstatic_lamport[77807]: {}
Jan 26 09:40:21 compute-1 systemd[1]: libpod-850cccbb0d31b69b90968858daecf9dc51ce737d873ebc97405868154a952b7b.scope: Deactivated successfully.
Jan 26 09:40:21 compute-1 podman[77791]: 2026-01-26 09:40:21.247672369 +0000 UTC m=+0.908096821 container died 850cccbb0d31b69b90968858daecf9dc51ce737d873ebc97405868154a952b7b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_lamport, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 09:40:21 compute-1 systemd[1]: libpod-850cccbb0d31b69b90968858daecf9dc51ce737d873ebc97405868154a952b7b.scope: Consumed 1.220s CPU time.
Jan 26 09:40:21 compute-1 systemd[1]: var-lib-containers-storage-overlay-6abc23e562a823c5ab5fd0e92fc563387bc662de40b5132122a694a67974f4c5-merged.mount: Deactivated successfully.
Jan 26 09:40:21 compute-1 podman[77791]: 2026-01-26 09:40:21.289556755 +0000 UTC m=+0.949981187 container remove 850cccbb0d31b69b90968858daecf9dc51ce737d873ebc97405868154a952b7b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_lamport, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 09:40:21 compute-1 systemd[1]: libpod-conmon-850cccbb0d31b69b90968858daecf9dc51ce737d873ebc97405868154a952b7b.scope: Deactivated successfully.
Jan 26 09:40:21 compute-1 sudo[77669]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83d000 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83d000 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83d000 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83d000 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 20 GiB
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluefs mount
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluefs mount shared_bdev_used = 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: RocksDB version: 7.9.2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Git sha 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: DB SUMMARY
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: DB Session ID:  GDOX11IZ247TJWEOACQM
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: CURRENT file:  CURRENT
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: IDENTITY file:  IDENTITY
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                         Options.error_if_exists: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.create_if_missing: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                         Options.paranoid_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                                     Options.env: 0x55d16f80ddc0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                                Options.info_log: 0x55d16f8117a0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_file_opening_threads: 16
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                              Options.statistics: (nil)
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.use_fsync: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.max_log_file_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                         Options.allow_fallocate: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.use_direct_reads: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.create_missing_column_families: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                              Options.db_log_dir: 
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                                 Options.wal_dir: db.wal
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.advise_random_on_open: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.write_buffer_manager: 0x55d16f908a00
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                            Options.rate_limiter: (nil)
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.unordered_write: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.row_cache: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                              Options.wal_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.allow_ingest_behind: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.two_write_queues: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.manual_wal_flush: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.wal_compression: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.atomic_flush: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.log_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.allow_data_in_errors: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.db_host_id: __hostname__
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.max_background_jobs: 4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.max_background_compactions: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.max_subcompactions: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.max_open_files: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.bytes_per_sync: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.max_background_flushes: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Compression algorithms supported:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         kZSTD supported: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         kXpressCompression supported: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         kBZip2Compression supported: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         kZSTDNotFinalCompression supported: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         kLZ4Compression supported: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         kZlibCompression supported: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         kLZ4HCCompression supported: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         kSnappyCompression supported: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55d16ea37350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55d16ea37350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55d16ea37350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55d16ea37350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55d16ea37350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55d16ea37350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55d16ea37350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811b80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55d16ea369b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811b80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55d16ea369b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811b80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55d16ea369b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 0b9759fc-282b-40dc-a20d-ecaf65ebba52
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420421636109, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420421636410, "job": 1, "event": "recovery_finished"}
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old nid_max 1025
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old blobid_max 10240
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta min_alloc_size 0x1000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: freelist init
Jan 26 09:40:21 compute-1 ceph-osd[77632]: freelist _read_cfg
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluefs umount
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83d000 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83d000 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83d000 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83d000 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bdev(0x55d16f83d000 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 20 GiB
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluefs mount
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluefs mount shared_bdev_used = 4718592
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: RocksDB version: 7.9.2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Git sha 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: DB SUMMARY
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: DB Session ID:  GDOX11IZ247TJWEOACQN
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: CURRENT file:  CURRENT
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: IDENTITY file:  IDENTITY
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                         Options.error_if_exists: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.create_if_missing: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                         Options.paranoid_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                                     Options.env: 0x55d16f9ac2a0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                                Options.info_log: 0x55d16f811940
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_file_opening_threads: 16
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                              Options.statistics: (nil)
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.use_fsync: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.max_log_file_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                         Options.allow_fallocate: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.use_direct_reads: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.create_missing_column_families: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                              Options.db_log_dir: 
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                                 Options.wal_dir: db.wal
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.advise_random_on_open: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.write_buffer_manager: 0x55d16f908a00
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                            Options.rate_limiter: (nil)
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.unordered_write: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.row_cache: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                              Options.wal_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.allow_ingest_behind: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.two_write_queues: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.manual_wal_flush: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.wal_compression: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.atomic_flush: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.log_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.allow_data_in_errors: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.db_host_id: __hostname__
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.max_background_jobs: 4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.max_background_compactions: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.max_subcompactions: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.max_open_files: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.bytes_per_sync: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.max_background_flushes: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Compression algorithms supported:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         kZSTD supported: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         kXpressCompression supported: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         kBZip2Compression supported: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         kZSTDNotFinalCompression supported: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         kLZ4Compression supported: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         kZlibCompression supported: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         kLZ4HCCompression supported: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         kSnappyCompression supported: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55d16ea37350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55d16ea37350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55d16ea37350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55d16ea37350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55d16ea37350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55d16ea37350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55d16ea37350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811ac0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55d16ea369b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811ac0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55d16ea369b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811ac0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55d16ea369b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 0b9759fc-282b-40dc-a20d-ecaf65ebba52
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420421883439, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420421888981, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420421, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0b9759fc-282b-40dc-a20d-ecaf65ebba52", "db_session_id": "GDOX11IZ247TJWEOACQN", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420421892035, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1595, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 469, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420421, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0b9759fc-282b-40dc-a20d-ecaf65ebba52", "db_session_id": "GDOX11IZ247TJWEOACQN", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420421895337, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420421, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0b9759fc-282b-40dc-a20d-ecaf65ebba52", "db_session_id": "GDOX11IZ247TJWEOACQN", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420421897133, "job": 1, "event": "recovery_finished"}
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55d16f9d8000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: DB pointer 0x55d16f9b8000
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super from 4, latest 4
Jan 26 09:40:21 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super done
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 09:40:21 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea369b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea369b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea369b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Jan 26 09:40:21 compute-1 ceph-osd[77632]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Jan 26 09:40:21 compute-1 ceph-osd[77632]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/hello/cls_hello.cc:316: loading cls_hello
Jan 26 09:40:21 compute-1 ceph-osd[77632]: _get_class not permitted to load lua
Jan 26 09:40:21 compute-1 ceph-osd[77632]: _get_class not permitted to load sdk
Jan 26 09:40:21 compute-1 ceph-osd[77632]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Jan 26 09:40:21 compute-1 ceph-osd[77632]: osd.1 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Jan 26 09:40:21 compute-1 ceph-osd[77632]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Jan 26 09:40:21 compute-1 ceph-osd[77632]: osd.1 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Jan 26 09:40:21 compute-1 ceph-osd[77632]: osd.1 0 load_pgs
Jan 26 09:40:21 compute-1 ceph-osd[77632]: osd.1 0 load_pgs opened 0 pgs
Jan 26 09:40:21 compute-1 ceph-osd[77632]: osd.1 0 log_to_monitors true
Jan 26 09:40:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1[77628]: 2026-01-26T09:40:21.928+0000 7fad5994c740 -1 osd.1 0 log_to_monitors true
Jan 26 09:40:22 compute-1 sudo[78314]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 09:40:22 compute-1 sudo[78314]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:22 compute-1 sudo[78314]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:22 compute-1 sudo[78339]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:40:22 compute-1 sudo[78339]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:22 compute-1 sudo[78339]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:22 compute-1 sudo[78364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 26 09:40:22 compute-1 sudo[78364]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:22 compute-1 podman[78462]: 2026-01-26 09:40:22.741738877 +0000 UTC m=+0.066424739 container exec 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Jan 26 09:40:22 compute-1 podman[78462]: 2026-01-26 09:40:22.848239439 +0000 UTC m=+0.172925301 container exec_died 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.40.1)
Jan 26 09:40:22 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Jan 26 09:40:22 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Jan 26 09:40:22 compute-1 sudo[78364]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:23 compute-1 sudo[78512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:40:23 compute-1 sudo[78512]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:23 compute-1 sudo[78512]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:23 compute-1 sudo[78537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 09:40:23 compute-1 sudo[78537]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:23 compute-1 sudo[78537]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:23 compute-1 sudo[78593]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:40:23 compute-1 sudo[78593]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:23 compute-1 sudo[78593]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:23 compute-1 sudo[78618]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30 -- inventory --format=json-pretty --filter-for-batch
Jan 26 09:40:23 compute-1 sudo[78618]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:40:24 compute-1 ceph-osd[77632]: osd.1 0 done with init, starting boot process
Jan 26 09:40:24 compute-1 ceph-osd[77632]: osd.1 0 start_boot
Jan 26 09:40:24 compute-1 ceph-osd[77632]: osd.1 0 maybe_override_options_for_qos osd_max_backfills set to 1
Jan 26 09:40:24 compute-1 ceph-osd[77632]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Jan 26 09:40:24 compute-1 ceph-osd[77632]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Jan 26 09:40:24 compute-1 ceph-osd[77632]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Jan 26 09:40:24 compute-1 ceph-osd[77632]: osd.1 0  bench count 12288000 bsize 4 KiB
Jan 26 09:40:24 compute-1 podman[78684]: 2026-01-26 09:40:24.219778748 +0000 UTC m=+0.056018311 container create 7557bcb0adb3e119edb8d915ceea666e854084fbd72d79f2d97052dbbeb3e992 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_heyrovsky, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Jan 26 09:40:24 compute-1 systemd[1]: Started libpod-conmon-7557bcb0adb3e119edb8d915ceea666e854084fbd72d79f2d97052dbbeb3e992.scope.
Jan 26 09:40:24 compute-1 podman[78684]: 2026-01-26 09:40:24.188318009 +0000 UTC m=+0.024557602 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:40:24 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:40:25 compute-1 podman[78684]: 2026-01-26 09:40:25.036058505 +0000 UTC m=+0.872298178 container init 7557bcb0adb3e119edb8d915ceea666e854084fbd72d79f2d97052dbbeb3e992 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_heyrovsky, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 09:40:25 compute-1 podman[78684]: 2026-01-26 09:40:25.048770288 +0000 UTC m=+0.885009851 container start 7557bcb0adb3e119edb8d915ceea666e854084fbd72d79f2d97052dbbeb3e992 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_heyrovsky, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Jan 26 09:40:25 compute-1 silly_heyrovsky[78701]: 167 167
Jan 26 09:40:25 compute-1 systemd[1]: libpod-7557bcb0adb3e119edb8d915ceea666e854084fbd72d79f2d97052dbbeb3e992.scope: Deactivated successfully.
Jan 26 09:40:25 compute-1 podman[78684]: 2026-01-26 09:40:25.070181959 +0000 UTC m=+0.906421542 container attach 7557bcb0adb3e119edb8d915ceea666e854084fbd72d79f2d97052dbbeb3e992 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_heyrovsky, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 09:40:25 compute-1 podman[78684]: 2026-01-26 09:40:25.071528988 +0000 UTC m=+0.907768551 container died 7557bcb0adb3e119edb8d915ceea666e854084fbd72d79f2d97052dbbeb3e992 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_heyrovsky, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Jan 26 09:40:25 compute-1 systemd[1]: var-lib-containers-storage-overlay-bc29a694281a3ce8f96c632fb35c2eed8c3cfe5635785b528ae0a08ae271faee-merged.mount: Deactivated successfully.
Jan 26 09:40:25 compute-1 podman[78684]: 2026-01-26 09:40:25.189141258 +0000 UTC m=+1.025380821 container remove 7557bcb0adb3e119edb8d915ceea666e854084fbd72d79f2d97052dbbeb3e992 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_heyrovsky, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid)
Jan 26 09:40:25 compute-1 systemd[1]: libpod-conmon-7557bcb0adb3e119edb8d915ceea666e854084fbd72d79f2d97052dbbeb3e992.scope: Deactivated successfully.
Jan 26 09:40:25 compute-1 podman[78730]: 2026-01-26 09:40:25.357144957 +0000 UTC m=+0.054641073 container create 257ce75e425f1d24a6afb55e29044312264c56830c9593b453072cdc645cdea9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_maxwell, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 09:40:25 compute-1 systemd[1]: Started libpod-conmon-257ce75e425f1d24a6afb55e29044312264c56830c9593b453072cdc645cdea9.scope.
Jan 26 09:40:25 compute-1 podman[78730]: 2026-01-26 09:40:25.326149251 +0000 UTC m=+0.023645267 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:40:25 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:40:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4de7a816c009eca28a69e0829b257d3277588570b1ed8e2b1efb8419d536b509/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4de7a816c009eca28a69e0829b257d3277588570b1ed8e2b1efb8419d536b509/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4de7a816c009eca28a69e0829b257d3277588570b1ed8e2b1efb8419d536b509/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4de7a816c009eca28a69e0829b257d3277588570b1ed8e2b1efb8419d536b509/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 09:40:25 compute-1 podman[78730]: 2026-01-26 09:40:25.462499585 +0000 UTC m=+0.159995591 container init 257ce75e425f1d24a6afb55e29044312264c56830c9593b453072cdc645cdea9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_maxwell, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Jan 26 09:40:25 compute-1 podman[78730]: 2026-01-26 09:40:25.468478346 +0000 UTC m=+0.165974342 container start 257ce75e425f1d24a6afb55e29044312264c56830c9593b453072cdc645cdea9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_maxwell, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 09:40:25 compute-1 podman[78730]: 2026-01-26 09:40:25.488923281 +0000 UTC m=+0.186419277 container attach 257ce75e425f1d24a6afb55e29044312264c56830c9593b453072cdc645cdea9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_maxwell, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]: [
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:     {
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:         "available": false,
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:         "being_replaced": false,
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:         "ceph_device_lvm": false,
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:         "device_id": "QEMU_DVD-ROM_QM00001",
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:         "lsm_data": {},
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:         "lvs": [],
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:         "path": "/dev/sr0",
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:         "rejected_reasons": [
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "Insufficient space (<5GB)",
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "Has a FileSystem"
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:         ],
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:         "sys_api": {
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "actuators": null,
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "device_nodes": [
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:                 "sr0"
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             ],
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "devname": "sr0",
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "human_readable_size": "482.00 KB",
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "id_bus": "ata",
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "model": "QEMU DVD-ROM",
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "nr_requests": "2",
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "parent": "/dev/sr0",
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "partitions": {},
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "path": "/dev/sr0",
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "removable": "1",
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "rev": "2.5+",
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "ro": "0",
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "rotational": "1",
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "sas_address": "",
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "sas_device_handle": "",
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "scheduler_mode": "mq-deadline",
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "sectors": 0,
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "sectorsize": "2048",
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "size": 493568.0,
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "support_discard": "2048",
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "type": "disk",
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:             "vendor": "QEMU"
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:         }
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]:     }
Jan 26 09:40:26 compute-1 heuristic_maxwell[78747]: ]
Jan 26 09:40:26 compute-1 systemd[1]: libpod-257ce75e425f1d24a6afb55e29044312264c56830c9593b453072cdc645cdea9.scope: Deactivated successfully.
Jan 26 09:40:26 compute-1 podman[78730]: 2026-01-26 09:40:26.151857467 +0000 UTC m=+0.849353473 container died 257ce75e425f1d24a6afb55e29044312264c56830c9593b453072cdc645cdea9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_maxwell, org.label-schema.build-date=20250325, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 09:40:26 compute-1 systemd[1]: var-lib-containers-storage-overlay-4de7a816c009eca28a69e0829b257d3277588570b1ed8e2b1efb8419d536b509-merged.mount: Deactivated successfully.
Jan 26 09:40:26 compute-1 podman[78730]: 2026-01-26 09:40:26.264170386 +0000 UTC m=+0.961666382 container remove 257ce75e425f1d24a6afb55e29044312264c56830c9593b453072cdc645cdea9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_maxwell, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 09:40:26 compute-1 systemd[1]: libpod-conmon-257ce75e425f1d24a6afb55e29044312264c56830c9593b453072cdc645cdea9.scope: Deactivated successfully.
Jan 26 09:40:26 compute-1 sudo[78618]: pam_unix(sudo:session): session closed for user root
Jan 26 09:40:28 compute-1 ceph-osd[77632]: osd.1 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 27.506 iops: 7041.512 elapsed_sec: 0.426
Jan 26 09:40:28 compute-1 ceph-osd[77632]: log_channel(cluster) log [WRN] : OSD bench result of 7041.512344 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 26 09:40:28 compute-1 ceph-osd[77632]: osd.1 0 waiting for initial osdmap
Jan 26 09:40:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1[77628]: 2026-01-26T09:40:28.344+0000 7fad560e2640 -1 osd.1 0 waiting for initial osdmap
Jan 26 09:40:28 compute-1 ceph-osd[77632]: osd.1 8 crush map has features 288514050185494528, adjusting msgr requires for clients
Jan 26 09:40:28 compute-1 ceph-osd[77632]: osd.1 8 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Jan 26 09:40:28 compute-1 ceph-osd[77632]: osd.1 8 crush map has features 3314932999778484224, adjusting msgr requires for osds
Jan 26 09:40:28 compute-1 ceph-osd[77632]: osd.1 8 check_osdmap_features require_osd_release unknown -> squid
Jan 26 09:40:28 compute-1 ceph-osd[77632]: osd.1 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 26 09:40:28 compute-1 ceph-osd[77632]: osd.1 8 set_numa_affinity not setting numa affinity
Jan 26 09:40:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1[77628]: 2026-01-26T09:40:28.367+0000 7fad50ef7640 -1 osd.1 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 26 09:40:28 compute-1 ceph-osd[77632]: osd.1 8 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial no unique device path for loop3: no symlink to loop3 in /dev/disk/by-path
Jan 26 09:40:28 compute-1 ceph-osd[77632]: osd.1 9 state: booting -> active
Jan 26 09:40:31 compute-1 ceph-osd[77632]: osd.1 10 crush map has features 288514051259236352, adjusting msgr requires for clients
Jan 26 09:40:31 compute-1 ceph-osd[77632]: osd.1 10 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Jan 26 09:40:31 compute-1 ceph-osd[77632]: osd.1 10 crush map has features 3314933000852226048, adjusting msgr requires for osds
Jan 26 09:40:31 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 10 pg[1.0( empty local-lis/les=0/0 n=0 ec=10/10 lis/c=0/0 les/c/f=0/0/0 sis=10) [1] r=0 lpr=10 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:40:31 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 11 pg[1.0( empty local-lis/les=10/11 n=0 ec=10/10 lis/c=0/0 les/c/f=0/0/0 sis=10) [1] r=0 lpr=10 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:40:33 compute-1 sshd-session[79778]: Connection closed by authenticating user root 152.42.136.110 port 45704 [preauth]
Jan 26 09:40:41 compute-1 sshd-session[79780]: Invalid user admin from 104.248.198.71 port 52228
Jan 26 09:40:42 compute-1 sshd-session[79780]: Connection closed by invalid user admin 104.248.198.71 port 52228 [preauth]
Jan 26 09:40:55 compute-1 sshd-session[79782]: Connection closed by authenticating user root 178.62.249.31 port 42724 [preauth]
Jan 26 09:41:03 compute-1 sudo[79784]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:41:03 compute-1 sudo[79784]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:03 compute-1 sudo[79784]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:03 compute-1 sudo[79809]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:41:03 compute-1 sudo[79809]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:04 compute-1 podman[79874]: 2026-01-26 09:41:04.255551518 +0000 UTC m=+0.057986854 container create f4dceb8e41c583fa6d4e1667a966e6de2f12d7ac363b290d4b2134f9914df149 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=interesting_morse, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Jan 26 09:41:04 compute-1 systemd[1]: Started libpod-conmon-f4dceb8e41c583fa6d4e1667a966e6de2f12d7ac363b290d4b2134f9914df149.scope.
Jan 26 09:41:04 compute-1 podman[79874]: 2026-01-26 09:41:04.222874966 +0000 UTC m=+0.025310392 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:41:04 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:41:04 compute-1 podman[79874]: 2026-01-26 09:41:04.3380223 +0000 UTC m=+0.140457656 container init f4dceb8e41c583fa6d4e1667a966e6de2f12d7ac363b290d4b2134f9914df149 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=interesting_morse, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid)
Jan 26 09:41:04 compute-1 podman[79874]: 2026-01-26 09:41:04.344115526 +0000 UTC m=+0.146550862 container start f4dceb8e41c583fa6d4e1667a966e6de2f12d7ac363b290d4b2134f9914df149 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=interesting_morse, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Jan 26 09:41:04 compute-1 podman[79874]: 2026-01-26 09:41:04.346622035 +0000 UTC m=+0.149057371 container attach f4dceb8e41c583fa6d4e1667a966e6de2f12d7ac363b290d4b2134f9914df149 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=interesting_morse, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Jan 26 09:41:04 compute-1 interesting_morse[79890]: 167 167
Jan 26 09:41:04 compute-1 systemd[1]: libpod-f4dceb8e41c583fa6d4e1667a966e6de2f12d7ac363b290d4b2134f9914df149.scope: Deactivated successfully.
Jan 26 09:41:04 compute-1 podman[79874]: 2026-01-26 09:41:04.350255534 +0000 UTC m=+0.152690870 container died f4dceb8e41c583fa6d4e1667a966e6de2f12d7ac363b290d4b2134f9914df149 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=interesting_morse, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True)
Jan 26 09:41:04 compute-1 systemd[1]: var-lib-containers-storage-overlay-c4f5786995e166730047d73ab55ff7fc8f5a8e3ead4465b53d811eec2110bbae-merged.mount: Deactivated successfully.
Jan 26 09:41:04 compute-1 podman[79874]: 2026-01-26 09:41:04.385499556 +0000 UTC m=+0.187934902 container remove f4dceb8e41c583fa6d4e1667a966e6de2f12d7ac363b290d4b2134f9914df149 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=interesting_morse, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Jan 26 09:41:04 compute-1 systemd[1]: libpod-conmon-f4dceb8e41c583fa6d4e1667a966e6de2f12d7ac363b290d4b2134f9914df149.scope: Deactivated successfully.
Jan 26 09:41:04 compute-1 podman[79908]: 2026-01-26 09:41:04.450763848 +0000 UTC m=+0.043559211 container create f2a20f0bab75e1bc312172474ced89ee318792eb49262d2dd46deb1b21807bad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_tharp, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 09:41:04 compute-1 systemd[1]: Started libpod-conmon-f2a20f0bab75e1bc312172474ced89ee318792eb49262d2dd46deb1b21807bad.scope.
Jan 26 09:41:04 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:41:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/826be8ffc87ec79a65fbc260f07deaaaefefd91c9203b84aeb29d9a684a5556e/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 09:41:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/826be8ffc87ec79a65fbc260f07deaaaefefd91c9203b84aeb29d9a684a5556e/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Jan 26 09:41:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/826be8ffc87ec79a65fbc260f07deaaaefefd91c9203b84aeb29d9a684a5556e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 09:41:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/826be8ffc87ec79a65fbc260f07deaaaefefd91c9203b84aeb29d9a684a5556e/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Jan 26 09:41:04 compute-1 podman[79908]: 2026-01-26 09:41:04.514683433 +0000 UTC m=+0.107478796 container init f2a20f0bab75e1bc312172474ced89ee318792eb49262d2dd46deb1b21807bad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_tharp, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 09:41:04 compute-1 podman[79908]: 2026-01-26 09:41:04.520598844 +0000 UTC m=+0.113394207 container start f2a20f0bab75e1bc312172474ced89ee318792eb49262d2dd46deb1b21807bad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_tharp, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True)
Jan 26 09:41:04 compute-1 podman[79908]: 2026-01-26 09:41:04.523454552 +0000 UTC m=+0.116249935 container attach f2a20f0bab75e1bc312172474ced89ee318792eb49262d2dd46deb1b21807bad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_tharp, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Jan 26 09:41:04 compute-1 podman[79908]: 2026-01-26 09:41:04.43254533 +0000 UTC m=+0.025340723 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:41:04 compute-1 systemd[1]: libpod-f2a20f0bab75e1bc312172474ced89ee318792eb49262d2dd46deb1b21807bad.scope: Deactivated successfully.
Jan 26 09:41:04 compute-1 podman[79908]: 2026-01-26 09:41:04.618897947 +0000 UTC m=+0.211693330 container died f2a20f0bab75e1bc312172474ced89ee318792eb49262d2dd46deb1b21807bad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_tharp, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 09:41:04 compute-1 systemd[1]: var-lib-containers-storage-overlay-826be8ffc87ec79a65fbc260f07deaaaefefd91c9203b84aeb29d9a684a5556e-merged.mount: Deactivated successfully.
Jan 26 09:41:04 compute-1 podman[79908]: 2026-01-26 09:41:04.650225492 +0000 UTC m=+0.243020855 container remove f2a20f0bab75e1bc312172474ced89ee318792eb49262d2dd46deb1b21807bad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_tharp, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325)
Jan 26 09:41:04 compute-1 systemd[1]: libpod-conmon-f2a20f0bab75e1bc312172474ced89ee318792eb49262d2dd46deb1b21807bad.scope: Deactivated successfully.
Jan 26 09:41:04 compute-1 systemd[1]: Reloading.
Jan 26 09:41:04 compute-1 systemd-rc-local-generator[79984]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:41:04 compute-1 systemd-sysv-generator[79988]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:41:04 compute-1 systemd[1]: Reloading.
Jan 26 09:41:05 compute-1 systemd-rc-local-generator[80030]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:41:05 compute-1 systemd-sysv-generator[80034]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:41:05 compute-1 systemd[1]: Starting Ceph mon.compute-1 for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 09:41:05 compute-1 podman[80087]: 2026-01-26 09:41:05.452736381 +0000 UTC m=+0.041458703 container create 0913a1c63c0c3a30499c5537209d0784e72b8023efea53f6d17a9d4102eb8fe2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mon-compute-1, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:41:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f4b4a876e7c967071e40ff8fe7a84132314fd625a16463acebae90d669257e8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 09:41:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f4b4a876e7c967071e40ff8fe7a84132314fd625a16463acebae90d669257e8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:41:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f4b4a876e7c967071e40ff8fe7a84132314fd625a16463acebae90d669257e8/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 09:41:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f4b4a876e7c967071e40ff8fe7a84132314fd625a16463acebae90d669257e8/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Jan 26 09:41:05 compute-1 podman[80087]: 2026-01-26 09:41:05.519487252 +0000 UTC m=+0.108209594 container init 0913a1c63c0c3a30499c5537209d0784e72b8023efea53f6d17a9d4102eb8fe2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mon-compute-1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 09:41:05 compute-1 podman[80087]: 2026-01-26 09:41:05.525025553 +0000 UTC m=+0.113747875 container start 0913a1c63c0c3a30499c5537209d0784e72b8023efea53f6d17a9d4102eb8fe2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mon-compute-1, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 09:41:05 compute-1 bash[80087]: 0913a1c63c0c3a30499c5537209d0784e72b8023efea53f6d17a9d4102eb8fe2
Jan 26 09:41:05 compute-1 podman[80087]: 2026-01-26 09:41:05.433553937 +0000 UTC m=+0.022276279 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:41:05 compute-1 systemd[1]: Started Ceph mon.compute-1 for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:41:05 compute-1 ceph-mon[80107]: set uid:gid to 167:167 (ceph:ceph)
Jan 26 09:41:05 compute-1 ceph-mon[80107]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mon, pid 2
Jan 26 09:41:05 compute-1 ceph-mon[80107]: pidfile_write: ignore empty --pid-file
Jan 26 09:41:05 compute-1 ceph-mon[80107]: load: jerasure load: lrc 
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: RocksDB version: 7.9.2
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Git sha 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: DB SUMMARY
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: DB Session ID:  OSRMBNXDC8EXU3R2EA69
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: CURRENT file:  CURRENT
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: IDENTITY file:  IDENTITY
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-1/store.db dir, Total Num: 0, files: 
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-1/store.db: 000004.log size: 636 ; 
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                         Options.error_if_exists: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                       Options.create_if_missing: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                         Options.paranoid_checks: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                                     Options.env: 0x55af2b350c20
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                                      Options.fs: PosixFileSystem
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                                Options.info_log: 0x55af2cafda20
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                Options.max_file_opening_threads: 16
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                              Options.statistics: (nil)
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                               Options.use_fsync: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                       Options.max_log_file_size: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                         Options.allow_fallocate: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                        Options.use_direct_reads: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:          Options.create_missing_column_families: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                              Options.db_log_dir: 
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                                 Options.wal_dir: 
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                   Options.advise_random_on_open: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                    Options.write_buffer_manager: 0x55af2cb01900
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                            Options.rate_limiter: (nil)
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                  Options.unordered_write: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                               Options.row_cache: None
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                              Options.wal_filter: None
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.allow_ingest_behind: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.two_write_queues: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.manual_wal_flush: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.wal_compression: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.atomic_flush: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                 Options.log_readahead_size: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.allow_data_in_errors: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.db_host_id: __hostname__
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.max_background_jobs: 2
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.max_background_compactions: -1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.max_subcompactions: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.max_total_wal_size: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                          Options.max_open_files: -1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                          Options.bytes_per_sync: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:       Options.compaction_readahead_size: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                  Options.max_background_flushes: -1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Compression algorithms supported:
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         kZSTD supported: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         kXpressCompression supported: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         kBZip2Compression supported: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         kZSTDNotFinalCompression supported: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         kLZ4Compression supported: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         kZlibCompression supported: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         kLZ4HCCompression supported: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         kSnappyCompression supported: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:           Options.merge_operator: 
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:        Options.compaction_filter: None
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55af2cafd6a0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55af2cb209b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:        Options.write_buffer_size: 33554432
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:  Options.max_write_buffer_number: 2
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:          Options.compression: NoCompression
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.num_levels: 7
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                           Options.bloom_locality: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                               Options.ttl: 2592000
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                       Options.enable_blob_files: false
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                           Options.min_blob_size: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 4e64dd99-608f-448d-a4f8-af05bb4d42d8
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420465572110, "job": 1, "event": "recovery_started", "wal_files": [4]}
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420465574602, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1773, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 648, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 526, "raw_average_value_size": 105, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420465574753, "job": 1, "event": "recovery_finished"}
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55af2cb22e00
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: DB pointer 0x55af2cb32000
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1 does not exist in monmap, will attempt to join an existing cluster
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 09:41:05 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.73 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.73 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55af2cb209b0#2 capacity: 512.00 MB usage: 0.98 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 5.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.77 KB,0.000146031%) FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Jan 26 09:41:05 compute-1 ceph-mon[80107]: using public_addr v2:192.168.122.101:0/0 -> [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0]
Jan 26 09:41:05 compute-1 ceph-mon[80107]: starting mon.compute-1 rank -1 at public addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] at bind addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-1 fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(???) e0 preinit fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:41:05 compute-1 sudo[79809]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).mds e1 new map
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).mds e1 print_map
                                           e1
                                           btime 2026-01-26T09:38:21:975599+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: -1
                                            
                                           No filesystems configured
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e8 e8: 2 total, 1 up, 2 in
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e9 e9: 2 total, 2 up, 2 in
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e10 e10: 2 total, 2 up, 2 in
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e11 e11: 2 total, 2 up, 2 in
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e12 crush map has features 3314933000852226048, adjusting msgr requires
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e12 crush map has features 288514051259236352, adjusting msgr requires
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e12 crush map has features 288514051259236352, adjusting msgr requires
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e12 crush map has features 288514051259236352, adjusting msgr requires
Jan 26 09:41:05 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 26 09:41:05 compute-1 ceph-mon[80107]: mon.compute-1@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Jan 26 09:41:07 compute-1 ceph-mon[80107]: mon.compute-1@-1(probing) e3  my rank is now 2 (was -1)
Jan 26 09:41:07 compute-1 ceph-mon[80107]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Jan 26 09:41:07 compute-1 ceph-mon[80107]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Jan 26 09:41:07 compute-1 ceph-mon[80107]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 09:41:10 compute-1 ceph-mon[80107]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 09:41:10 compute-1 ceph-mon[80107]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 09:41:10 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Jan 26 09:41:10 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout,16=squid ondisk layout}
Jan 26 09:41:10 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 09:41:10 compute-1 ceph-mon[80107]: mgrc update_daemon_metadata mon.compute-1 metadata {addrs=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-1,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864304,os=Linux}
Jan 26 09:41:10 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Jan 26 09:41:11 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e13 e13: 2 total, 2 up, 2 in
Jan 26 09:41:11 compute-1 ceph-mon[80107]: pgmap v75: 1 pgs: 1 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 26 09:41:11 compute-1 ceph-mon[80107]: Deploying daemon mgr.compute-2.oynaeu on compute-2
Jan 26 09:41:11 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3981712437' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 26 09:41:11 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Jan 26 09:41:11 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 26 09:41:11 compute-1 ceph-mon[80107]: mon.compute-0 calling monitor election
Jan 26 09:41:11 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 26 09:41:11 compute-1 ceph-mon[80107]: mon.compute-2 calling monitor election
Jan 26 09:41:11 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 26 09:41:11 compute-1 ceph-mon[80107]: pgmap v76: 1 pgs: 1 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 26 09:41:11 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 26 09:41:11 compute-1 ceph-mon[80107]: mon.compute-1 calling monitor election
Jan 26 09:41:11 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 26 09:41:11 compute-1 ceph-mon[80107]: pgmap v77: 1 pgs: 1 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 26 09:41:11 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 26 09:41:11 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 26 09:41:11 compute-1 ceph-mon[80107]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 26 09:41:11 compute-1 ceph-mon[80107]: monmap epoch 3
Jan 26 09:41:11 compute-1 ceph-mon[80107]: fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:41:11 compute-1 ceph-mon[80107]: last_changed 2026-01-26T09:41:05.675064+0000
Jan 26 09:41:11 compute-1 ceph-mon[80107]: created 2026-01-26T09:38:19.068625+0000
Jan 26 09:41:11 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 13 pg[2.0( empty local-lis/les=0/0 n=0 ec=13/13 lis/c=0/0 les/c/f=0/0/0 sis=13) [1] r=0 lpr=13 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:11 compute-1 sudo[80146]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:41:11 compute-1 sudo[80146]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:11 compute-1 sudo[80146]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:11 compute-1 sudo[80171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:41:11 compute-1 sudo[80171]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:11 compute-1 podman[80237]: 2026-01-26 09:41:11.743990975 +0000 UTC m=+0.043257962 container create c26e552561eee9962ccbb140b8a7d3b49ec454284392c55d2ce70be6f0c3f39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_matsumoto, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 09:41:11 compute-1 systemd[1]: Started libpod-conmon-c26e552561eee9962ccbb140b8a7d3b49ec454284392c55d2ce70be6f0c3f39c.scope.
Jan 26 09:41:11 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:41:11 compute-1 podman[80237]: 2026-01-26 09:41:11.724481392 +0000 UTC m=+0.023748399 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:41:11 compute-1 podman[80237]: 2026-01-26 09:41:11.82621106 +0000 UTC m=+0.125478067 container init c26e552561eee9962ccbb140b8a7d3b49ec454284392c55d2ce70be6f0c3f39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_matsumoto, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, ceph=True)
Jan 26 09:41:11 compute-1 podman[80237]: 2026-01-26 09:41:11.834104934 +0000 UTC m=+0.133371921 container start c26e552561eee9962ccbb140b8a7d3b49ec454284392c55d2ce70be6f0c3f39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_matsumoto, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 09:41:11 compute-1 podman[80237]: 2026-01-26 09:41:11.838221477 +0000 UTC m=+0.137488494 container attach c26e552561eee9962ccbb140b8a7d3b49ec454284392c55d2ce70be6f0c3f39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_matsumoto, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Jan 26 09:41:11 compute-1 upbeat_matsumoto[80254]: 167 167
Jan 26 09:41:11 compute-1 systemd[1]: libpod-c26e552561eee9962ccbb140b8a7d3b49ec454284392c55d2ce70be6f0c3f39c.scope: Deactivated successfully.
Jan 26 09:41:11 compute-1 podman[80237]: 2026-01-26 09:41:11.840802658 +0000 UTC m=+0.140069675 container died c26e552561eee9962ccbb140b8a7d3b49ec454284392c55d2ce70be6f0c3f39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_matsumoto, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 09:41:11 compute-1 systemd[1]: var-lib-containers-storage-overlay-ab8936b293513bdd83643f1cb105d585fdca2440dfb7f9630c13e584ac026faa-merged.mount: Deactivated successfully.
Jan 26 09:41:11 compute-1 podman[80237]: 2026-01-26 09:41:11.878439455 +0000 UTC m=+0.177706442 container remove c26e552561eee9962ccbb140b8a7d3b49ec454284392c55d2ce70be6f0c3f39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_matsumoto, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 09:41:11 compute-1 systemd[1]: libpod-conmon-c26e552561eee9962ccbb140b8a7d3b49ec454284392c55d2ce70be6f0c3f39c.scope: Deactivated successfully.
Jan 26 09:41:11 compute-1 systemd[1]: Reloading.
Jan 26 09:41:11 compute-1 systemd-rc-local-generator[80292]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:41:11 compute-1 systemd-sysv-generator[80298]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:41:12 compute-1 systemd[1]: Reloading.
Jan 26 09:41:12 compute-1 systemd-rc-local-generator[80337]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:41:12 compute-1 systemd-sysv-generator[80340]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:41:12 compute-1 ceph-mon[80107]: min_mon_release 19 (squid)
Jan 26 09:41:12 compute-1 ceph-mon[80107]: election_strategy: 1
Jan 26 09:41:12 compute-1 ceph-mon[80107]: 0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
Jan 26 09:41:12 compute-1 ceph-mon[80107]: 1: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.compute-2
Jan 26 09:41:12 compute-1 ceph-mon[80107]: 2: [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon.compute-1
Jan 26 09:41:12 compute-1 ceph-mon[80107]: fsmap 
Jan 26 09:41:12 compute-1 ceph-mon[80107]: osdmap e12: 2 total, 2 up, 2 in
Jan 26 09:41:12 compute-1 ceph-mon[80107]: mgrmap e9: compute-0.zllcia(active, since 2m)
Jan 26 09:41:12 compute-1 ceph-mon[80107]: overall HEALTH_OK
Jan 26 09:41:12 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:12 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3981712437' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 26 09:41:12 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:12 compute-1 ceph-mon[80107]: osdmap e13: 2 total, 2 up, 2 in
Jan 26 09:41:12 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:12 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:12 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.xammti", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 26 09:41:12 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.xammti", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 26 09:41:12 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 09:41:12 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:41:12 compute-1 ceph-mon[80107]: Deploying daemon mgr.compute-1.xammti on compute-1
Jan 26 09:41:12 compute-1 ceph-mon[80107]: pgmap v79: 2 pgs: 1 unknown, 1 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 26 09:41:12 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 26 09:41:12 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3023141661' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 26 09:41:12 compute-1 systemd[1]: Starting Ceph mgr.compute-1.xammti for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 09:41:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e14 e14: 2 total, 2 up, 2 in
Jan 26 09:41:12 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 14 pg[2.0( empty local-lis/les=13/14 n=0 ec=13/13 lis/c=0/0 les/c/f=0/0/0 sis=13) [1] r=0 lpr=13 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:12 compute-1 podman[80397]: 2026-01-26 09:41:12.729397055 +0000 UTC m=+0.067585006 container create 78ca290e02e494a62ed6e7db1247d89a717a3aab43caa9273ac85407686f6e4e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Jan 26 09:41:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/209a1586a896d1660d6a6956909ce330bbbef0c072dfc98a2bca17967f7a0a06/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:41:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/209a1586a896d1660d6a6956909ce330bbbef0c072dfc98a2bca17967f7a0a06/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 09:41:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/209a1586a896d1660d6a6956909ce330bbbef0c072dfc98a2bca17967f7a0a06/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 09:41:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/209a1586a896d1660d6a6956909ce330bbbef0c072dfc98a2bca17967f7a0a06/merged/var/lib/ceph/mgr/ceph-compute-1.xammti supports timestamps until 2038 (0x7fffffff)
Jan 26 09:41:12 compute-1 podman[80397]: 2026-01-26 09:41:12.688936661 +0000 UTC m=+0.027124632 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:41:12 compute-1 podman[80397]: 2026-01-26 09:41:12.792084206 +0000 UTC m=+0.130272187 container init 78ca290e02e494a62ed6e7db1247d89a717a3aab43caa9273ac85407686f6e4e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 09:41:12 compute-1 podman[80397]: 2026-01-26 09:41:12.797234557 +0000 UTC m=+0.135422508 container start 78ca290e02e494a62ed6e7db1247d89a717a3aab43caa9273ac85407686f6e4e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Jan 26 09:41:12 compute-1 bash[80397]: 78ca290e02e494a62ed6e7db1247d89a717a3aab43caa9273ac85407686f6e4e
Jan 26 09:41:12 compute-1 systemd[1]: Started Ceph mgr.compute-1.xammti for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:41:12 compute-1 sudo[80171]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Jan 26 09:41:12 compute-1 ceph-mgr[80416]: set uid:gid to 167:167 (ceph:ceph)
Jan 26 09:41:12 compute-1 ceph-mgr[80416]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 26 09:41:12 compute-1 ceph-mgr[80416]: pidfile_write: ignore empty --pid-file
Jan 26 09:41:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Jan 26 09:41:12 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'alerts'
Jan 26 09:41:13 compute-1 ceph-mgr[80416]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 09:41:13 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'balancer'
Jan 26 09:41:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:13.158+0000 7f66329a4140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 09:41:13 compute-1 ceph-mgr[80416]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 09:41:13 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'cephadm'
Jan 26 09:41:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:13.246+0000 7f66329a4140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 09:41:13 compute-1 ceph-mon[80107]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 26 09:41:13 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3023141661' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 26 09:41:13 compute-1 ceph-mon[80107]: osdmap e14: 2 total, 2 up, 2 in
Jan 26 09:41:13 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:13 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:13 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e15 e15: 2 total, 2 up, 2 in
Jan 26 09:41:14 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'crash'
Jan 26 09:41:14 compute-1 ceph-mgr[80416]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 09:41:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:14.465+0000 7f66329a4140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 09:41:14 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'dashboard'
Jan 26 09:41:14 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:14 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:14 compute-1 ceph-mon[80107]: osdmap e15: 2 total, 2 up, 2 in
Jan 26 09:41:14 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 26 09:41:14 compute-1 ceph-mon[80107]: pgmap v82: 3 pgs: 1 creating+peering, 1 unknown, 1 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 26 09:41:14 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Jan 26 09:41:14 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:41:14 compute-1 ceph-mon[80107]: Deploying daemon crash.compute-2 on compute-2
Jan 26 09:41:14 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1199163324' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 26 09:41:14 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e16 e16: 2 total, 2 up, 2 in
Jan 26 09:41:15 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e16 _set_new_cache_sizes cache_size:1019934124 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:41:15 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'devicehealth'
Jan 26 09:41:15 compute-1 ceph-mgr[80416]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 09:41:15 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'diskprediction_local'
Jan 26 09:41:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:15.739+0000 7f66329a4140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 09:41:15 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e17 e17: 2 total, 2 up, 2 in
Jan 26 09:41:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 26 09:41:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 26 09:41:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]:   from numpy import show_config as show_numpy_config
Jan 26 09:41:15 compute-1 ceph-mgr[80416]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 09:41:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:15.989+0000 7f66329a4140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 09:41:15 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'influx'
Jan 26 09:41:16 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1199163324' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 26 09:41:16 compute-1 ceph-mon[80107]: osdmap e16: 2 total, 2 up, 2 in
Jan 26 09:41:16 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:16 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:16 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:16 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:16 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 09:41:16 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 09:41:16 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:41:16 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 09:41:16 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:41:16 compute-1 ceph-mon[80107]: pgmap v84: 4 pgs: 1 unknown, 2 active+clean, 1 creating+peering; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 26 09:41:16 compute-1 ceph-mgr[80416]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 09:41:16 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'insights'
Jan 26 09:41:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:16.080+0000 7f66329a4140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 09:41:16 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'iostat'
Jan 26 09:41:16 compute-1 ceph-mgr[80416]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 09:41:16 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'k8sevents'
Jan 26 09:41:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:16.570+0000 7f66329a4140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 09:41:17 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'localpool'
Jan 26 09:41:17 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'mds_autoscaler'
Jan 26 09:41:17 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'mirroring'
Jan 26 09:41:17 compute-1 sshd-session[80448]: Connection closed by authenticating user root 152.42.136.110 port 37562 [preauth]
Jan 26 09:41:17 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'nfs'
Jan 26 09:41:18 compute-1 ceph-mgr[80416]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 09:41:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:18.045+0000 7f66329a4140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 09:41:18 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'orchestrator'
Jan 26 09:41:18 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e18 e18: 2 total, 2 up, 2 in
Jan 26 09:41:18 compute-1 ceph-mon[80107]: osdmap e17: 2 total, 2 up, 2 in
Jan 26 09:41:18 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:18 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1746553743' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 26 09:41:18 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e19 e19: 3 total, 2 up, 3 in
Jan 26 09:41:19 compute-1 ceph-mgr[80416]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 09:41:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:19.077+0000 7f66329a4140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 09:41:19 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'osd_perf_query'
Jan 26 09:41:19 compute-1 ceph-mgr[80416]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 09:41:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:19.175+0000 7f66329a4140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 09:41:19 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'osd_support'
Jan 26 09:41:19 compute-1 ceph-mgr[80416]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 09:41:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:19.401+0000 7f66329a4140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 09:41:19 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'pg_autoscaler'
Jan 26 09:41:19 compute-1 ceph-mgr[80416]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 09:41:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:19.833+0000 7f66329a4140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 09:41:19 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'progress'
Jan 26 09:41:20 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/4005713193' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "cdaf6859-268c-4a38-b792-ad916b17c334"}]: dispatch
Jan 26 09:41:20 compute-1 ceph-mon[80107]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "cdaf6859-268c-4a38-b792-ad916b17c334"}]: dispatch
Jan 26 09:41:20 compute-1 ceph-mon[80107]: Standby manager daemon compute-2.oynaeu started
Jan 26 09:41:20 compute-1 ceph-mon[80107]: pgmap v86: 4 pgs: 1 unknown, 2 active+clean, 1 creating+peering; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 26 09:41:20 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1746553743' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 26 09:41:20 compute-1 ceph-mon[80107]: osdmap e18: 2 total, 2 up, 2 in
Jan 26 09:41:20 compute-1 ceph-mon[80107]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "cdaf6859-268c-4a38-b792-ad916b17c334"}]': finished
Jan 26 09:41:20 compute-1 ceph-mon[80107]: osdmap e19: 3 total, 2 up, 3 in
Jan 26 09:41:20 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:41:20 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3839283137' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Jan 26 09:41:20 compute-1 ceph-mon[80107]: mgrmap e10: compute-0.zllcia(active, since 2m), standbys: compute-2.oynaeu
Jan 26 09:41:20 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mgr metadata", "who": "compute-2.oynaeu", "id": "compute-2.oynaeu"}]: dispatch
Jan 26 09:41:20 compute-1 ceph-mgr[80416]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 09:41:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:20.230+0000 7f66329a4140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 09:41:20 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'prometheus'
Jan 26 09:41:20 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e19 _set_new_cache_sizes cache_size:1020053172 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:41:21 compute-1 ceph-mgr[80416]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 09:41:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:21.455+0000 7f66329a4140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 09:41:21 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'rbd_support'
Jan 26 09:41:21 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e20 e20: 3 total, 2 up, 3 in
Jan 26 09:41:21 compute-1 ceph-mgr[80416]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 09:41:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:21.596+0000 7f66329a4140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 09:41:21 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'restful'
Jan 26 09:41:21 compute-1 ceph-mon[80107]: pgmap v89: 5 pgs: 1 creating+peering, 4 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 26 09:41:21 compute-1 ceph-mon[80107]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 26 09:41:21 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/929823694' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 26 09:41:21 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'rgw'
Jan 26 09:41:22 compute-1 ceph-mgr[80416]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 09:41:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:22.316+0000 7f66329a4140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 09:41:22 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'rook'
Jan 26 09:41:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e21 e21: 3 total, 2 up, 3 in
Jan 26 09:41:23 compute-1 ceph-mon[80107]: osdmap e20: 3 total, 2 up, 3 in
Jan 26 09:41:23 compute-1 ceph-mon[80107]: pgmap v91: 5 pgs: 1 creating+peering, 4 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 26 09:41:23 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:41:23 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/929823694' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 26 09:41:23 compute-1 ceph-mon[80107]: osdmap e21: 3 total, 2 up, 3 in
Jan 26 09:41:23 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:41:23 compute-1 ceph-mgr[80416]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 09:41:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:23.583+0000 7f66329a4140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 09:41:23 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'selftest'
Jan 26 09:41:23 compute-1 ceph-mgr[80416]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 09:41:23 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'snap_schedule'
Jan 26 09:41:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:23.674+0000 7f66329a4140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 09:41:23 compute-1 ceph-mgr[80416]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 09:41:23 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'stats'
Jan 26 09:41:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:23.780+0000 7f66329a4140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 09:41:23 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'status'
Jan 26 09:41:23 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e22 e22: 3 total, 2 up, 3 in
Jan 26 09:41:23 compute-1 ceph-mgr[80416]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 09:41:23 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'telegraf'
Jan 26 09:41:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:23.966+0000 7f66329a4140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 09:41:24 compute-1 ceph-mgr[80416]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 09:41:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:24.054+0000 7f66329a4140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 09:41:24 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'telemetry'
Jan 26 09:41:24 compute-1 ceph-mon[80107]: pgmap v93: 6 pgs: 1 creating+peering, 5 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 26 09:41:24 compute-1 ceph-mon[80107]: osdmap e22: 3 total, 2 up, 3 in
Jan 26 09:41:24 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:41:24 compute-1 ceph-mgr[80416]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 09:41:24 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'test_orchestrator'
Jan 26 09:41:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:24.693+0000 7f66329a4140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 09:41:25 compute-1 ceph-mgr[80416]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 09:41:25 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'volumes'
Jan 26 09:41:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:25.152+0000 7f66329a4140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 09:41:25 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e23 e23: 3 total, 2 up, 3 in
Jan 26 09:41:25 compute-1 ceph-mgr[80416]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 09:41:25 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'zabbix'
Jan 26 09:41:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:25.484+0000 7f66329a4140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 09:41:25 compute-1 ceph-mgr[80416]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 09:41:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:25.575+0000 7f66329a4140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 09:41:25 compute-1 ceph-mgr[80416]: ms_deliver_dispatch: unhandled message 0x561413592d00 mon_map magic: 0 from mon.0 v2:192.168.122.100:3300/0
Jan 26 09:41:25 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e23 _set_new_cache_sizes cache_size:1020054711 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:41:26 compute-1 ceph-mon[80107]: osdmap e23: 3 total, 2 up, 3 in
Jan 26 09:41:26 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:41:26 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1985194690' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 26 09:41:26 compute-1 ceph-mon[80107]: Standby manager daemon compute-1.xammti started
Jan 26 09:41:26 compute-1 ceph-mon[80107]: pgmap v96: 6 pgs: 1 creating+peering, 5 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 26 09:41:26 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e24 e24: 3 total, 2 up, 3 in
Jan 26 09:41:27 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 24 pg[7.0( empty local-lis/les=0/0 n=0 ec=24/24 lis/c=0/0 les/c/f=0/0/0 sis=24) [1] r=0 lpr=24 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:28 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1985194690' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 26 09:41:28 compute-1 ceph-mon[80107]: mgrmap e11: compute-0.zllcia(active, since 2m), standbys: compute-2.oynaeu, compute-1.xammti
Jan 26 09:41:28 compute-1 ceph-mon[80107]: osdmap e24: 3 total, 2 up, 3 in
Jan 26 09:41:28 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:41:28 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mgr metadata", "who": "compute-1.xammti", "id": "compute-1.xammti"}]: dispatch
Jan 26 09:41:28 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Jan 26 09:41:28 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:41:28 compute-1 ceph-mon[80107]: Health check update: 5 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 26 09:41:28 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e25 e25: 3 total, 2 up, 3 in
Jan 26 09:41:28 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 25 pg[7.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=0/0 les/c/f=0/0/0 sis=24) [1] r=0 lpr=24 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:29 compute-1 ceph-mon[80107]: Deploying daemon osd.2 on compute-2
Jan 26 09:41:29 compute-1 ceph-mon[80107]: pgmap v98: 7 pgs: 1 unknown, 1 creating+peering, 5 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 26 09:41:29 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2556293450' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Jan 26 09:41:29 compute-1 ceph-mon[80107]: osdmap e25: 3 total, 2 up, 3 in
Jan 26 09:41:29 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:41:29 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:30 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e26 e26: 3 total, 2 up, 3 in
Jan 26 09:41:30 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e26 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:41:30 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:30 compute-1 ceph-mon[80107]: pgmap v100: 7 pgs: 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 26 09:41:30 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2556293450' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Jan 26 09:41:30 compute-1 ceph-mon[80107]: osdmap e26: 3 total, 2 up, 3 in
Jan 26 09:41:30 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:41:32 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:32 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:32 compute-1 ceph-mon[80107]: pgmap v102: 7 pgs: 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 26 09:41:32 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1132920361' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Jan 26 09:41:32 compute-1 ceph-mon[80107]: Health check update: 6 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 26 09:41:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e27 e27: 3 total, 2 up, 3 in
Jan 26 09:41:33 compute-1 sudo[80450]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 09:41:33 compute-1 sudo[80450]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:33 compute-1 sudo[80450]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:33 compute-1 sudo[80475]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:41:33 compute-1 sudo[80475]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:33 compute-1 sudo[80475]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:33 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e28 e28: 3 total, 2 up, 3 in
Jan 26 09:41:33 compute-1 sudo[80500]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 26 09:41:33 compute-1 sudo[80500]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:34 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1132920361' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Jan 26 09:41:34 compute-1 ceph-mon[80107]: osdmap e27: 3 total, 2 up, 3 in
Jan 26 09:41:34 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:41:34 compute-1 ceph-mon[80107]: from='osd.2 [v2:192.168.122.102:6800/4046341804,v1:192.168.122.102:6801/4046341804]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 26 09:41:34 compute-1 ceph-mon[80107]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 26 09:41:34 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:34 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:35 compute-1 podman[80594]: 2026-01-26 09:41:35.082228092 +0000 UTC m=+0.119316858 container exec 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:41:35 compute-1 podman[80594]: 2026-01-26 09:41:35.224078065 +0000 UTC m=+0.261166791 container exec_died 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 09:41:35 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e29 e29: 3 total, 2 up, 3 in
Jan 26 09:41:35 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e29 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:41:35 compute-1 sudo[80500]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:35 compute-1 ceph-mon[80107]: pgmap v104: 7 pgs: 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 26 09:41:35 compute-1 ceph-mon[80107]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Jan 26 09:41:35 compute-1 ceph-mon[80107]: osdmap e28: 3 total, 2 up, 3 in
Jan 26 09:41:35 compute-1 ceph-mon[80107]: from='osd.2 [v2:192.168.122.102:6800/4046341804,v1:192.168.122.102:6801/4046341804]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 26 09:41:35 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:41:35 compute-1 ceph-mon[80107]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 26 09:41:35 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2154943776' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Jan 26 09:41:35 compute-1 ceph-mon[80107]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]': finished
Jan 26 09:41:35 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2154943776' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Jan 26 09:41:35 compute-1 ceph-mon[80107]: osdmap e29: 3 total, 2 up, 3 in
Jan 26 09:41:35 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:41:36 compute-1 sshd-session[80669]: Invalid user admin from 104.248.198.71 port 37628
Jan 26 09:41:36 compute-1 sshd-session[80669]: Connection closed by invalid user admin 104.248.198.71 port 37628 [preauth]
Jan 26 09:41:36 compute-1 sudo[80677]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:41:36 compute-1 sudo[80677]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:36 compute-1 sudo[80677]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:36 compute-1 sudo[80702]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 09:41:36 compute-1 sudo[80702]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:37 compute-1 sudo[80702]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:37 compute-1 ceph-mon[80107]: purged_snaps scrub starts
Jan 26 09:41:37 compute-1 ceph-mon[80107]: purged_snaps scrub ok
Jan 26 09:41:37 compute-1 ceph-mon[80107]: pgmap v107: 7 pgs: 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 26 09:41:37 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:37 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:37 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:37 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:41:37 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:37 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2817412888' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Jan 26 09:41:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e30 e30: 3 total, 2 up, 3 in
Jan 26 09:41:38 compute-1 ceph-mon[80107]: Health check update: 3 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 26 09:41:38 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:41:38 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2817412888' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Jan 26 09:41:38 compute-1 ceph-mon[80107]: osdmap e30: 3 total, 2 up, 3 in
Jan 26 09:41:38 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:41:38 compute-1 ceph-mon[80107]: pgmap v109: 7 pgs: 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 26 09:41:39 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:41:39 compute-1 sudo[80758]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 26 09:41:39 compute-1 sudo[80758]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:39 compute-1 sudo[80758]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:39 compute-1 sudo[80783]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph
Jan 26 09:41:39 compute-1 sudo[80783]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:39 compute-1 sudo[80783]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:39 compute-1 sudo[80808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new
Jan 26 09:41:39 compute-1 sudo[80808]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:39 compute-1 sudo[80808]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:39 compute-1 sudo[80833]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:41:39 compute-1 sudo[80833]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:39 compute-1 sudo[80833]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:39 compute-1 sudo[80858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new
Jan 26 09:41:39 compute-1 sudo[80858]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:39 compute-1 sudo[80858]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:39 compute-1 sudo[80906]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new
Jan 26 09:41:39 compute-1 sudo[80906]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:39 compute-1 sudo[80906]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:40 compute-1 sudo[80931]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new
Jan 26 09:41:40 compute-1 sudo[80931]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:40 compute-1 sudo[80931]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:40 compute-1 sudo[80956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Jan 26 09:41:40 compute-1 sudo[80956]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:40 compute-1 sudo[80956]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:40 compute-1 sudo[80981]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config
Jan 26 09:41:40 compute-1 sudo[80981]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:40 compute-1 sudo[80981]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:40 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e31 e31: 3 total, 2 up, 3 in
Jan 26 09:41:40 compute-1 sudo[81006]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config
Jan 26 09:41:40 compute-1 sudo[81006]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:40 compute-1 sudo[81006]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:40 compute-1 sudo[81031]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new
Jan 26 09:41:40 compute-1 sudo[81031]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:40 compute-1 sudo[81031]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:40 compute-1 sudo[81056]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:41:40 compute-1 sudo[81056]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:40 compute-1 sudo[81056]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:40 compute-1 sudo[81081]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new
Jan 26 09:41:40 compute-1 sudo[81081]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:40 compute-1 sudo[81081]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:40 compute-1 sudo[81129]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new
Jan 26 09:41:40 compute-1 sudo[81129]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:40 compute-1 sudo[81129]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:40 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:41:40 compute-1 sudo[81154]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new
Jan 26 09:41:40 compute-1 sudo[81154]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:40 compute-1 sudo[81154]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:40 compute-1 sudo[81179]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 09:41:40 compute-1 sudo[81179]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:40 compute-1 sudo[81179]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:41 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3209549506' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Jan 26 09:41:41 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:41:41 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:41 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:41 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Jan 26 09:41:41 compute-1 ceph-mon[80107]: Adjusting osd_memory_target on compute-2 to 127.9M
Jan 26 09:41:41 compute-1 ceph-mon[80107]: Unable to set osd_memory_target on compute-2 to 134209126: error parsing value: Value '134209126' is below minimum 939524096
Jan 26 09:41:41 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:41:41 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 09:41:41 compute-1 ceph-mon[80107]: Updating compute-0:/etc/ceph/ceph.conf
Jan 26 09:41:41 compute-1 ceph-mon[80107]: Updating compute-1:/etc/ceph/ceph.conf
Jan 26 09:41:41 compute-1 ceph-mon[80107]: Updating compute-2:/etc/ceph/ceph.conf
Jan 26 09:41:41 compute-1 ceph-mon[80107]: pgmap v110: 7 pgs: 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 26 09:41:41 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e32 e32: 3 total, 3 up, 3 in
Jan 26 09:41:42 compute-1 ceph-mon[80107]: OSD bench result of 4662.749970 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 26 09:41:42 compute-1 ceph-mon[80107]: Updating compute-1:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 09:41:42 compute-1 ceph-mon[80107]: Updating compute-0:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 09:41:42 compute-1 ceph-mon[80107]: Updating compute-2:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 09:41:42 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3209549506' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Jan 26 09:41:42 compute-1 ceph-mon[80107]: osdmap e31: 3 total, 2 up, 3 in
Jan 26 09:41:42 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:41:42 compute-1 ceph-mon[80107]: pgmap v112: 7 pgs: 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 26 09:41:42 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:41:42 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:42 compute-1 ceph-mon[80107]: osd.2 [v2:192.168.122.102:6800/4046341804,v1:192.168.122.102:6801/4046341804] boot
Jan 26 09:41:42 compute-1 ceph-mon[80107]: osdmap e32: 3 total, 3 up, 3 in
Jan 26 09:41:42 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:41:42 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:42 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:42 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:42 compute-1 sshd-session[81204]: Connection closed by authenticating user root 178.62.249.31 port 54060 [preauth]
Jan 26 09:41:44 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e33 e33: 3 total, 3 up, 3 in
Jan 26 09:41:44 compute-1 ceph-mon[80107]: Health check update: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 26 09:41:44 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:44 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:44 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:44 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 09:41:44 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 09:41:44 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:41:44 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3224045909' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Jan 26 09:41:45 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e34 e34: 3 total, 3 up, 3 in
Jan 26 09:41:45 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e34 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:41:45 compute-1 ceph-mon[80107]: pgmap v114: 7 pgs: 2 peering, 5 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:41:45 compute-1 ceph-mon[80107]: osdmap e33: 3 total, 3 up, 3 in
Jan 26 09:41:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3224045909' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Jan 26 09:41:45 compute-1 ceph-mon[80107]: osdmap e34: 3 total, 3 up, 3 in
Jan 26 09:41:47 compute-1 ceph-mon[80107]: pgmap v117: 7 pgs: 2 peering, 5 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:41:47 compute-1 ceph-mon[80107]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 26 09:41:47 compute-1 ceph-mon[80107]: Cluster is now healthy
Jan 26 09:41:47 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 09:41:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e35 e35: 3 total, 3 up, 3 in
Jan 26 09:41:48 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Jan 26 09:41:48 compute-1 ceph-mon[80107]: osdmap e35: 3 total, 3 up, 3 in
Jan 26 09:41:48 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 09:41:48 compute-1 ceph-mon[80107]: pgmap v119: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:41:48 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 09:41:48 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e36 e36: 3 total, 3 up, 3 in
Jan 26 09:41:48 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 36 pg[2.0( empty local-lis/les=13/14 n=0 ec=13/13 lis/c=13/13 les/c/f=14/14/0 sis=36 pruub=11.864285469s) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active pruub 98.601074219s@ mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:41:48 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 36 pg[2.0( empty local-lis/les=13/14 n=0 ec=13/13 lis/c=13/13 les/c/f=14/14/0 sis=36 pruub=11.864285469s) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown pruub 98.601074219s@ mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e37 e37: 3 total, 3 up, 3 in
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1f( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1d( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1e( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1c( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1b( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.9( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.8( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.a( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.7( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.6( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.4( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.2( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.5( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.3( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.c( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.b( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.d( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.e( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.f( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.10( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.11( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.12( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.13( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.14( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.16( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.17( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.15( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.18( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1a( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.19( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1f( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1d( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Jan 26 09:41:49 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 09:41:49 compute-1 ceph-mon[80107]: osdmap e36: 3 total, 3 up, 3 in
Jan 26 09:41:49 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 09:41:49 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.8( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1e( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.a( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.9( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1b( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.7( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.4( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1c( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.5( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.0( empty local-lis/les=36/37 n=0 ec=13/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.b( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.2( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.d( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.e( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.f( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.10( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.c( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.11( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.14( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.13( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.17( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.16( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.12( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.15( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.18( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.19( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1a( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.3( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.6( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:50 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1f scrub starts
Jan 26 09:41:50 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1f scrub ok
Jan 26 09:41:50 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:41:50 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e38 e38: 3 total, 3 up, 3 in
Jan 26 09:41:50 compute-1 ceph-mon[80107]: pgmap v121: 38 pgs: 1 peering, 31 unknown, 6 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:41:50 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Jan 26 09:41:50 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 09:41:50 compute-1 ceph-mon[80107]: osdmap e37: 3 total, 3 up, 3 in
Jan 26 09:41:50 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 09:41:51 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1d deep-scrub starts
Jan 26 09:41:51 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1d deep-scrub ok
Jan 26 09:41:52 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1e scrub starts
Jan 26 09:41:52 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1e scrub ok
Jan 26 09:41:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e39 e39: 3 total, 3 up, 3 in
Jan 26 09:41:52 compute-1 ceph-mon[80107]: 2.1f scrub starts
Jan 26 09:41:52 compute-1 ceph-mon[80107]: 2.1f scrub ok
Jan 26 09:41:52 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Jan 26 09:41:52 compute-1 ceph-mon[80107]: osdmap e38: 3 total, 3 up, 3 in
Jan 26 09:41:52 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 09:41:52 compute-1 ceph-mon[80107]: 3.1f scrub starts
Jan 26 09:41:52 compute-1 ceph-mon[80107]: 3.1f scrub ok
Jan 26 09:41:52 compute-1 ceph-mon[80107]: pgmap v124: 69 pgs: 1 peering, 62 unknown, 6 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:41:52 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 09:41:52 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 09:41:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/4014478342' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Jan 26 09:41:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/4014478342' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Jan 26 09:41:53 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.9 deep-scrub starts
Jan 26 09:41:53 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.9 deep-scrub ok
Jan 26 09:41:53 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e40 e40: 3 total, 3 up, 3 in
Jan 26 09:41:53 compute-1 ceph-mon[80107]: 2.1d deep-scrub starts
Jan 26 09:41:53 compute-1 ceph-mon[80107]: 2.1d deep-scrub ok
Jan 26 09:41:53 compute-1 ceph-mon[80107]: 2.1e scrub starts
Jan 26 09:41:53 compute-1 ceph-mon[80107]: 2.1e scrub ok
Jan 26 09:41:53 compute-1 ceph-mon[80107]: 3.1d scrub starts
Jan 26 09:41:53 compute-1 ceph-mon[80107]: 3.1d scrub ok
Jan 26 09:41:53 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Jan 26 09:41:53 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 09:41:53 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 09:41:53 compute-1 ceph-mon[80107]: osdmap e39: 3 total, 3 up, 3 in
Jan 26 09:41:53 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 09:41:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1576342763' entity='client.admin' 
Jan 26 09:41:53 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Jan 26 09:41:53 compute-1 ceph-mon[80107]: osdmap e40: 3 total, 3 up, 3 in
Jan 26 09:41:53 compute-1 sudo[81206]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 09:41:53 compute-1 sudo[81206]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:53 compute-1 sudo[81206]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:54 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.8 deep-scrub starts
Jan 26 09:41:54 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.8 deep-scrub ok
Jan 26 09:41:54 compute-1 ceph-mon[80107]: 2.9 deep-scrub starts
Jan 26 09:41:54 compute-1 ceph-mon[80107]: 2.9 deep-scrub ok
Jan 26 09:41:54 compute-1 ceph-mon[80107]: 3.1e scrub starts
Jan 26 09:41:54 compute-1 ceph-mon[80107]: 3.1e scrub ok
Jan 26 09:41:54 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:54 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:54 compute-1 ceph-mon[80107]: 4.1f scrub starts
Jan 26 09:41:54 compute-1 ceph-mon[80107]: 4.1f scrub ok
Jan 26 09:41:54 compute-1 ceph-mon[80107]: pgmap v127: 131 pgs: 2 peering, 93 unknown, 36 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:41:54 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 09:41:54 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 09:41:54 compute-1 ceph-mon[80107]: Reconfiguring mon.compute-0 (monmap changed)...
Jan 26 09:41:54 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 26 09:41:54 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Jan 26 09:41:54 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:41:54 compute-1 ceph-mon[80107]: Reconfiguring daemon mon.compute-0 on compute-0
Jan 26 09:41:54 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:54 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:54 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:54 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:54 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.zllcia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 26 09:41:54 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 09:41:54 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:41:54 compute-1 ceph-mon[80107]: 3.b scrub starts
Jan 26 09:41:54 compute-1 ceph-mon[80107]: 3.b scrub ok
Jan 26 09:41:54 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e41 e41: 3 total, 3 up, 3 in
Jan 26 09:41:55 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1b scrub starts
Jan 26 09:41:55 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1b scrub ok
Jan 26 09:41:55 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e42 e42: 3 total, 3 up, 3 in
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 41 pg[7.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.039972305s) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active pruub 106.636978149s@ mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.039972305s) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown pruub 106.636978149s@ mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.4( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.5( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.c( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.d( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.e( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.2( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.f( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.3( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.1( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.6( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.7( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.8( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.9( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.a( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.b( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.10( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.11( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.13( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.12( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.14( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.15( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.16( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.17( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.19( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.18( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.1a( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.1b( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.1c( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.1d( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.1e( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.1f( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:41:55 compute-1 ceph-mon[80107]: from='client.14298 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 09:41:55 compute-1 ceph-mon[80107]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 26 09:41:55 compute-1 ceph-mon[80107]: Saving service ingress.rgw.default spec with placement count:2
Jan 26 09:41:55 compute-1 ceph-mon[80107]: 2.8 deep-scrub starts
Jan 26 09:41:55 compute-1 ceph-mon[80107]: 2.8 deep-scrub ok
Jan 26 09:41:55 compute-1 ceph-mon[80107]: Reconfiguring mgr.compute-0.zllcia (monmap changed)...
Jan 26 09:41:55 compute-1 ceph-mon[80107]: Reconfiguring daemon mgr.compute-0.zllcia on compute-0
Jan 26 09:41:55 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 09:41:55 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 09:41:55 compute-1 ceph-mon[80107]: osdmap e41: 3 total, 3 up, 3 in
Jan 26 09:41:55 compute-1 ceph-mon[80107]: 4.7 deep-scrub starts
Jan 26 09:41:55 compute-1 ceph-mon[80107]: 4.7 deep-scrub ok
Jan 26 09:41:55 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:55 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:55 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 26 09:41:55 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:41:55 compute-1 ceph-mon[80107]: 3.a scrub starts
Jan 26 09:41:55 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:41:56 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1 deep-scrub starts
Jan 26 09:41:56 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1 deep-scrub ok
Jan 26 09:41:56 compute-1 sudo[81231]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:41:56 compute-1 sudo[81231]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:56 compute-1 sudo[81231]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:56 compute-1 sudo[81256]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:41:56 compute-1 sudo[81256]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e43 e43: 3 total, 3 up, 3 in
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.1c( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.13( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.1f( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-mon[80107]: Reconfiguring crash.compute-0 (monmap changed)...
Jan 26 09:41:56 compute-1 ceph-mon[80107]: Reconfiguring daemon crash.compute-0 on compute-0
Jan 26 09:41:56 compute-1 ceph-mon[80107]: 2.1b scrub starts
Jan 26 09:41:56 compute-1 ceph-mon[80107]: 2.1b scrub ok
Jan 26 09:41:56 compute-1 ceph-mon[80107]: 3.a scrub ok
Jan 26 09:41:56 compute-1 ceph-mon[80107]: 4.b deep-scrub starts
Jan 26 09:41:56 compute-1 ceph-mon[80107]: osdmap e42: 3 total, 3 up, 3 in
Jan 26 09:41:56 compute-1 ceph-mon[80107]: 4.b deep-scrub ok
Jan 26 09:41:56 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:56 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:56 compute-1 ceph-mon[80107]: Reconfiguring osd.0 (monmap changed)...
Jan 26 09:41:56 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Jan 26 09:41:56 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:41:56 compute-1 ceph-mon[80107]: Reconfiguring daemon osd.0 on compute-0
Jan 26 09:41:56 compute-1 ceph-mon[80107]: pgmap v130: 193 pgs: 1 peering, 124 unknown, 68 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:41:56 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:56 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:56 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:56 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:56 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:56 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:56 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 26 09:41:56 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:41:56 compute-1 ceph-mon[80107]: 3.1c scrub starts
Jan 26 09:41:56 compute-1 ceph-mon[80107]: 3.1c scrub ok
Jan 26 09:41:56 compute-1 ceph-mon[80107]: osdmap e43: 3 total, 3 up, 3 in
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.1d( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.10( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.12( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.16( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.17( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.11( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.14( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.15( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.b( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.a( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.e( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.8( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.6( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.0( empty local-lis/les=41/43 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.9( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.5( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.4( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.1( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.7( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.3( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.c( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.2( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.d( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.1e( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.f( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.19( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.18( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.1b( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.1a( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:41:56 compute-1 podman[81297]: 2026-01-26 09:41:56.703684593 +0000 UTC m=+0.044999409 container create b29931f2f71444d70514aabec122e1b84092c0101c95c0d86b2c2cb4570f2332 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_chaplygin, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default)
Jan 26 09:41:56 compute-1 systemd[72713]: Starting Mark boot as successful...
Jan 26 09:41:56 compute-1 systemd[72713]: Finished Mark boot as successful.
Jan 26 09:41:56 compute-1 systemd[1]: Started libpod-conmon-b29931f2f71444d70514aabec122e1b84092c0101c95c0d86b2c2cb4570f2332.scope.
Jan 26 09:41:56 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:41:56 compute-1 podman[81297]: 2026-01-26 09:41:56.683823281 +0000 UTC m=+0.025138117 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:41:56 compute-1 podman[81297]: 2026-01-26 09:41:56.785952549 +0000 UTC m=+0.127267385 container init b29931f2f71444d70514aabec122e1b84092c0101c95c0d86b2c2cb4570f2332 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_chaplygin, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:41:56 compute-1 podman[81297]: 2026-01-26 09:41:56.795930851 +0000 UTC m=+0.137245667 container start b29931f2f71444d70514aabec122e1b84092c0101c95c0d86b2c2cb4570f2332 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_chaplygin, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Jan 26 09:41:56 compute-1 podman[81297]: 2026-01-26 09:41:56.802324066 +0000 UTC m=+0.143638902 container attach b29931f2f71444d70514aabec122e1b84092c0101c95c0d86b2c2cb4570f2332 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_chaplygin, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1)
Jan 26 09:41:56 compute-1 infallible_chaplygin[81314]: 167 167
Jan 26 09:41:56 compute-1 systemd[1]: libpod-b29931f2f71444d70514aabec122e1b84092c0101c95c0d86b2c2cb4570f2332.scope: Deactivated successfully.
Jan 26 09:41:56 compute-1 podman[81297]: 2026-01-26 09:41:56.806649424 +0000 UTC m=+0.147964270 container died b29931f2f71444d70514aabec122e1b84092c0101c95c0d86b2c2cb4570f2332 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_chaplygin, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Jan 26 09:41:56 compute-1 systemd[1]: var-lib-containers-storage-overlay-7b09fe1a64abaaa3d1bc139993fab6f61383ba137d0e44997a3af79b23d69153-merged.mount: Deactivated successfully.
Jan 26 09:41:56 compute-1 podman[81297]: 2026-01-26 09:41:56.850682446 +0000 UTC m=+0.191997262 container remove b29931f2f71444d70514aabec122e1b84092c0101c95c0d86b2c2cb4570f2332 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_chaplygin, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.schema-version=1.0)
Jan 26 09:41:56 compute-1 systemd[1]: libpod-conmon-b29931f2f71444d70514aabec122e1b84092c0101c95c0d86b2c2cb4570f2332.scope: Deactivated successfully.
Jan 26 09:41:56 compute-1 sudo[81256]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:57 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.7 scrub starts
Jan 26 09:41:57 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.7 scrub ok
Jan 26 09:41:57 compute-1 sudo[81331]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:41:57 compute-1 sudo[81331]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:57 compute-1 sudo[81331]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:57 compute-1 sudo[81356]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:41:57 compute-1 sudo[81356]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:57 compute-1 podman[81397]: 2026-01-26 09:41:57.932317524 +0000 UTC m=+0.062073736 container create cccf79dfe42342f2e3443718be4ba9c6062e257ccf57674f6bf35544b8066bf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=reverent_cohen, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Jan 26 09:41:57 compute-1 ceph-mon[80107]: 2.1 deep-scrub starts
Jan 26 09:41:57 compute-1 ceph-mon[80107]: 2.1 deep-scrub ok
Jan 26 09:41:57 compute-1 ceph-mon[80107]: from='client.14304 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 09:41:57 compute-1 ceph-mon[80107]: Saving service node-exporter spec with placement *
Jan 26 09:41:57 compute-1 ceph-mon[80107]: Saving service grafana spec with placement compute-0;count:1
Jan 26 09:41:57 compute-1 ceph-mon[80107]: Saving service prometheus spec with placement compute-0;count:1
Jan 26 09:41:57 compute-1 ceph-mon[80107]: Saving service alertmanager spec with placement compute-0;count:1
Jan 26 09:41:57 compute-1 ceph-mon[80107]: Reconfiguring crash.compute-1 (monmap changed)...
Jan 26 09:41:57 compute-1 ceph-mon[80107]: Reconfiguring daemon crash.compute-1 on compute-1
Jan 26 09:41:57 compute-1 ceph-mon[80107]: 4.6 deep-scrub starts
Jan 26 09:41:57 compute-1 ceph-mon[80107]: 4.6 deep-scrub ok
Jan 26 09:41:57 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:57 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:57 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:57 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Jan 26 09:41:57 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:41:57 compute-1 systemd[1]: Started libpod-conmon-cccf79dfe42342f2e3443718be4ba9c6062e257ccf57674f6bf35544b8066bf9.scope.
Jan 26 09:41:57 compute-1 podman[81397]: 2026-01-26 09:41:57.901174004 +0000 UTC m=+0.030930246 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:41:58 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:41:58 compute-1 podman[81397]: 2026-01-26 09:41:58.025889928 +0000 UTC m=+0.155646190 container init cccf79dfe42342f2e3443718be4ba9c6062e257ccf57674f6bf35544b8066bf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=reverent_cohen, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=squid)
Jan 26 09:41:58 compute-1 podman[81397]: 2026-01-26 09:41:58.038415054 +0000 UTC m=+0.168171256 container start cccf79dfe42342f2e3443718be4ba9c6062e257ccf57674f6bf35544b8066bf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=reverent_cohen, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Jan 26 09:41:58 compute-1 podman[81397]: 2026-01-26 09:41:58.042758224 +0000 UTC m=+0.172514496 container attach cccf79dfe42342f2e3443718be4ba9c6062e257ccf57674f6bf35544b8066bf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=reverent_cohen, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 09:41:58 compute-1 reverent_cohen[81413]: 167 167
Jan 26 09:41:58 compute-1 systemd[1]: libpod-cccf79dfe42342f2e3443718be4ba9c6062e257ccf57674f6bf35544b8066bf9.scope: Deactivated successfully.
Jan 26 09:41:58 compute-1 conmon[81413]: conmon cccf79dfe42342f2e344 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-cccf79dfe42342f2e3443718be4ba9c6062e257ccf57674f6bf35544b8066bf9.scope/container/memory.events
Jan 26 09:41:58 compute-1 podman[81397]: 2026-01-26 09:41:58.046461187 +0000 UTC m=+0.176217399 container died cccf79dfe42342f2e3443718be4ba9c6062e257ccf57674f6bf35544b8066bf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=reverent_cohen, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Jan 26 09:41:58 compute-1 systemd[1]: var-lib-containers-storage-overlay-6b509946277c5ffa87f095919ad2a2bff52212c4129d80b8e30f1f974f708959-merged.mount: Deactivated successfully.
Jan 26 09:41:58 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Jan 26 09:41:58 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Jan 26 09:41:58 compute-1 podman[81397]: 2026-01-26 09:41:58.094145717 +0000 UTC m=+0.223901889 container remove cccf79dfe42342f2e3443718be4ba9c6062e257ccf57674f6bf35544b8066bf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=reverent_cohen, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Jan 26 09:41:58 compute-1 systemd[1]: libpod-conmon-cccf79dfe42342f2e3443718be4ba9c6062e257ccf57674f6bf35544b8066bf9.scope: Deactivated successfully.
Jan 26 09:41:58 compute-1 sudo[81356]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:58 compute-1 sudo[81437]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:41:58 compute-1 sudo[81437]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:58 compute-1 sudo[81437]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:58 compute-1 sudo[81462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:41:58 compute-1 sudo[81462]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:41:58 compute-1 podman[81501]: 2026-01-26 09:41:58.914959797 +0000 UTC m=+0.036147222 container create cd06c5cdd94519ec9b0c636ea140ff9c7b9c570d14bb140c1466a6aadb6fdef9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_curran, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 09:41:58 compute-1 ceph-mon[80107]: 2.7 scrub starts
Jan 26 09:41:58 compute-1 ceph-mon[80107]: 2.7 scrub ok
Jan 26 09:41:58 compute-1 ceph-mon[80107]: 3.9 scrub starts
Jan 26 09:41:58 compute-1 ceph-mon[80107]: Reconfiguring osd.1 (monmap changed)...
Jan 26 09:41:58 compute-1 ceph-mon[80107]: 3.9 scrub ok
Jan 26 09:41:58 compute-1 ceph-mon[80107]: Reconfiguring daemon osd.1 on compute-1
Jan 26 09:41:58 compute-1 ceph-mon[80107]: 4.a scrub starts
Jan 26 09:41:58 compute-1 ceph-mon[80107]: 4.a scrub ok
Jan 26 09:41:58 compute-1 ceph-mon[80107]: pgmap v132: 193 pgs: 1 peering, 93 unknown, 99 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:41:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/468906199' entity='client.admin' 
Jan 26 09:41:58 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:58 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:41:58 compute-1 ceph-mon[80107]: Reconfiguring mon.compute-1 (monmap changed)...
Jan 26 09:41:58 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 26 09:41:58 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Jan 26 09:41:58 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:41:58 compute-1 ceph-mon[80107]: Reconfiguring daemon mon.compute-1 on compute-1
Jan 26 09:41:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/4184262969' entity='client.admin' 
Jan 26 09:41:58 compute-1 systemd[1]: Started libpod-conmon-cd06c5cdd94519ec9b0c636ea140ff9c7b9c570d14bb140c1466a6aadb6fdef9.scope.
Jan 26 09:41:58 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:41:58 compute-1 podman[81501]: 2026-01-26 09:41:58.984309697 +0000 UTC m=+0.105497152 container init cd06c5cdd94519ec9b0c636ea140ff9c7b9c570d14bb140c1466a6aadb6fdef9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_curran, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2)
Jan 26 09:41:58 compute-1 podman[81501]: 2026-01-26 09:41:58.989519291 +0000 UTC m=+0.110706716 container start cd06c5cdd94519ec9b0c636ea140ff9c7b9c570d14bb140c1466a6aadb6fdef9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_curran, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 09:41:58 compute-1 cool_curran[81517]: 167 167
Jan 26 09:41:58 compute-1 systemd[1]: libpod-cd06c5cdd94519ec9b0c636ea140ff9c7b9c570d14bb140c1466a6aadb6fdef9.scope: Deactivated successfully.
Jan 26 09:41:58 compute-1 podman[81501]: 2026-01-26 09:41:58.898786919 +0000 UTC m=+0.019974364 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:41:58 compute-1 podman[81501]: 2026-01-26 09:41:58.995153037 +0000 UTC m=+0.116340482 container attach cd06c5cdd94519ec9b0c636ea140ff9c7b9c570d14bb140c1466a6aadb6fdef9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_curran, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Jan 26 09:41:58 compute-1 podman[81501]: 2026-01-26 09:41:58.995681703 +0000 UTC m=+0.116869128 container died cd06c5cdd94519ec9b0c636ea140ff9c7b9c570d14bb140c1466a6aadb6fdef9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_curran, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:41:59 compute-1 systemd[1]: var-lib-containers-storage-overlay-95fe52c1ffe4c834bd214a626e765ebc17c43ad3d55c91808874fba0a5fa77c4-merged.mount: Deactivated successfully.
Jan 26 09:41:59 compute-1 podman[81501]: 2026-01-26 09:41:59.032028459 +0000 UTC m=+0.153215884 container remove cd06c5cdd94519ec9b0c636ea140ff9c7b9c570d14bb140c1466a6aadb6fdef9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_curran, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Jan 26 09:41:59 compute-1 systemd[1]: libpod-conmon-cd06c5cdd94519ec9b0c636ea140ff9c7b9c570d14bb140c1466a6aadb6fdef9.scope: Deactivated successfully.
Jan 26 09:41:59 compute-1 sudo[81462]: pam_unix(sudo:session): session closed for user root
Jan 26 09:41:59 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Jan 26 09:41:59 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1c scrub ok
Jan 26 09:42:00 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.0 scrub starts
Jan 26 09:42:00 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.0 scrub ok
Jan 26 09:42:00 compute-1 ceph-mon[80107]: 2.4 scrub starts
Jan 26 09:42:00 compute-1 ceph-mon[80107]: 2.4 scrub ok
Jan 26 09:42:00 compute-1 ceph-mon[80107]: 3.4 scrub starts
Jan 26 09:42:00 compute-1 ceph-mon[80107]: 3.4 scrub ok
Jan 26 09:42:00 compute-1 ceph-mon[80107]: 4.1d scrub starts
Jan 26 09:42:00 compute-1 ceph-mon[80107]: 4.1d scrub ok
Jan 26 09:42:00 compute-1 ceph-mon[80107]: 2.1c scrub starts
Jan 26 09:42:00 compute-1 ceph-mon[80107]: 2.1c scrub ok
Jan 26 09:42:00 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:00 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 09:42:00 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 09:42:00 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 09:42:00 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 09:42:00 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e43 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:42:00 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e44 e44: 3 total, 3 up, 3 in
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.18( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.1c( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.1b( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.1c( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.f( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.1( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.5( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.3( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.2( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.7( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.a( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.d( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.c( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.9( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.f( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.10( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.16( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.13( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.15( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.14( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.11( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.16( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.10( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.1f( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.1d( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.759461403s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621505737s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.19( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.927159309s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.789230347s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.18( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.927136421s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.789222717s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.1d( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.759422302s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621505737s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.18( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.927088737s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.789222717s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.19( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.927085876s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.789230347s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.10( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.759243965s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621566772s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.15( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.926724434s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.789070129s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.10( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.759223938s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621566772s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.15( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.926703453s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.789070129s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.13( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.926499367s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.789016724s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.13( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.926483154s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.789016724s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.11( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.759040833s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621620178s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.16( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758980751s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621597290s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.11( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.759001732s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621620178s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.12( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.926407814s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.789062500s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.16( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758960724s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621597290s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.12( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.926392555s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.789062500s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.1f( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.752825737s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.615554810s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.1f( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.752787590s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.615554810s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.10( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925808907s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788749695s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.14( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758676529s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621643066s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.10( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925792694s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788749695s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.13( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.752552032s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.615554810s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.14( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758651733s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621643066s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.f( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925723076s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788749695s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.13( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.752522469s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.615554810s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.f( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925703049s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788749695s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.a( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758640289s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621711731s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.a( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758584976s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621711731s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.e( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925417900s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788589478s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.e( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925391197s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788589478s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.d( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925354004s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788589478s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.b( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758435249s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621688843s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.d( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925333977s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788589478s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.b( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758410454s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621688843s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.c( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925474167s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788848877s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.9( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758394241s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621780396s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.9( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758376122s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621780396s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.e( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758297920s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621726990s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.c( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925447464s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788848877s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.e( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758279800s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621726990s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.b( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925062180s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788543701s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.b( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925036430s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788543701s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.5( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758199692s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621826172s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.5( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758179665s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621826172s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.6( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758061409s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621742249s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.5( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.924700737s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788383484s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.924477577s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788177490s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.924451828s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788177490s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.5( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.924640656s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788383484s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.6( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758026123s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621742249s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.4( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758007050s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621833801s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.4( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757991791s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621833801s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.4( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.924235344s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788177490s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.6( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925599098s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.789573669s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.4( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.924210548s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788177490s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.3( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757957458s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621955872s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.6( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925580025s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.789573669s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.3( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757932663s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621955872s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.2( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757944107s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621994019s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.2( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757916451s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621994019s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.9( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.923816681s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788032532s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.f( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757732391s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621994019s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.9( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.923790932s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788032532s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.f( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757705688s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621994019s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.1e( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757695198s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.622009277s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.a( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.923678398s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788002014s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.1e( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757678986s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.622009277s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.a( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.923655510s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788002014s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1c( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.923947334s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788406372s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1b( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.923660278s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788131714s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1c( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.923925400s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788406372s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1b( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.923631668s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788131714s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1d( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.914971352s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.779518127s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.18( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757479668s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.622047424s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1d( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.914941788s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.779518127s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.18( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757456779s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.622047424s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.1b( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757425308s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.622062683s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1e( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.923221588s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.787895203s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.1b( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757405281s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.622062683s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1e( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.923195839s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.787895203s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1f( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.914700508s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.779487610s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1f( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.914666176s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.779487610s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.8( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.756810188s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621734619s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.8( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.756777763s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621734619s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.1a( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[4.18( empty local-lis/les=0/0 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.15( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[4.13( empty local-lis/les=0/0 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[4.e( empty local-lis/les=0/0 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.d( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[4.d( empty local-lis/les=0/0 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[4.c( empty local-lis/les=0/0 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.e( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.2( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.3( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[4.5( empty local-lis/les=0/0 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.7( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.8( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[4.1b( empty local-lis/les=0/0 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.19( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[4.1a( empty local-lis/les=0/0 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.5( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.a( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[4.a( empty local-lis/les=0/0 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:01 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1a scrub starts
Jan 26 09:42:01 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1a scrub ok
Jan 26 09:42:01 compute-1 sudo[81533]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:42:01 compute-1 sudo[81533]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:01 compute-1 sudo[81533]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:01 compute-1 sudo[81559]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 26 09:42:01 compute-1 sudo[81559]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:01 compute-1 ceph-mon[80107]: 3.8 deep-scrub starts
Jan 26 09:42:01 compute-1 ceph-mon[80107]: 3.8 deep-scrub ok
Jan 26 09:42:01 compute-1 ceph-mon[80107]: 4.1a scrub starts
Jan 26 09:42:01 compute-1 ceph-mon[80107]: pgmap v133: 193 pgs: 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:42:01 compute-1 ceph-mon[80107]: 4.1a scrub ok
Jan 26 09:42:01 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 09:42:01 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 09:42:01 compute-1 ceph-mon[80107]: 2.0 scrub starts
Jan 26 09:42:01 compute-1 ceph-mon[80107]: 2.0 scrub ok
Jan 26 09:42:01 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:01 compute-1 ceph-mon[80107]: Reconfiguring mon.compute-2 (monmap changed)...
Jan 26 09:42:01 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 26 09:42:01 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Jan 26 09:42:01 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:42:01 compute-1 ceph-mon[80107]: Reconfiguring daemon mon.compute-2 on compute-2
Jan 26 09:42:01 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 09:42:01 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 09:42:01 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 09:42:01 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 09:42:01 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 09:42:01 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 09:42:01 compute-1 ceph-mon[80107]: osdmap e44: 3 total, 3 up, 3 in
Jan 26 09:42:01 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2724000136' entity='client.admin' 
Jan 26 09:42:01 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:01 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e45 e45: 3 total, 3 up, 3 in
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[4.1b( empty local-lis/les=44/45 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.7( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[4.18( empty local-lis/les=44/45 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.1a( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.1c( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[4.1a( empty local-lis/les=44/45 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.e( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[4.e( empty local-lis/les=44/45 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.d( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.3( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.7( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.2( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.8( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.d( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[4.c( empty local-lis/les=44/45 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.19( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[4.5( empty local-lis/les=44/45 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[4.d( empty local-lis/les=44/45 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[4.a( empty local-lis/les=44/45 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.f( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.9( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.10( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.c( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.15( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.14( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.16( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.15( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[4.13( empty local-lis/les=44/45 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.13( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.11( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.10( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.16( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.1f( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.a( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.a( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.2( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.5( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.3( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.f( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.1( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.5( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.1c( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.18( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.1b( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:01 compute-1 podman[81653]: 2026-01-26 09:42:01.84591933 +0000 UTC m=+0.072795076 container exec 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 09:42:01 compute-1 podman[81653]: 2026-01-26 09:42:01.972227838 +0000 UTC m=+0.199103584 container exec_died 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True)
Jan 26 09:42:02 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.1c scrub starts
Jan 26 09:42:02 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.1c scrub ok
Jan 26 09:42:02 compute-1 sudo[81559]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:02 compute-1 ceph-mon[80107]: 3.5 scrub starts
Jan 26 09:42:02 compute-1 ceph-mon[80107]: 3.5 scrub ok
Jan 26 09:42:02 compute-1 ceph-mon[80107]: 4.5 scrub starts
Jan 26 09:42:02 compute-1 ceph-mon[80107]: 4.5 scrub ok
Jan 26 09:42:02 compute-1 ceph-mon[80107]: 2.1a scrub starts
Jan 26 09:42:02 compute-1 ceph-mon[80107]: 2.1a scrub ok
Jan 26 09:42:02 compute-1 ceph-mon[80107]: 3.1b scrub starts
Jan 26 09:42:02 compute-1 ceph-mon[80107]: 3.1b scrub ok
Jan 26 09:42:02 compute-1 ceph-mon[80107]: 4.17 scrub starts
Jan 26 09:42:02 compute-1 ceph-mon[80107]: pgmap v135: 193 pgs: 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:42:02 compute-1 ceph-mon[80107]: 4.17 scrub ok
Jan 26 09:42:02 compute-1 ceph-mon[80107]: osdmap e45: 3 total, 3 up, 3 in
Jan 26 09:42:02 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:02 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:02 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:02 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:02 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:42:02 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 09:42:02 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:02 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 09:42:02 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 09:42:02 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:42:02 compute-1 sshd-session[81532]: Connection closed by authenticating user root 152.42.136.110 port 41468 [preauth]
Jan 26 09:42:03 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Jan 26 09:42:03 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Jan 26 09:42:03 compute-1 sudo[81762]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alxeniukezhjrtcwkaiudlzkpgxpxvce ; /usr/bin/python3'
Jan 26 09:42:03 compute-1 sudo[81762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:42:03 compute-1 python3[81764]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a -f 'name=ceph-?(.*)-mgr.*' --format \{\{\.Command\}\} --no-trunc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:42:03 compute-1 sudo[81762]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:03 compute-1 ceph-mon[80107]: 7.1c scrub starts
Jan 26 09:42:03 compute-1 ceph-mon[80107]: 7.1c scrub ok
Jan 26 09:42:03 compute-1 ceph-mon[80107]: 3.1a scrub starts
Jan 26 09:42:03 compute-1 ceph-mon[80107]: 3.1a scrub ok
Jan 26 09:42:03 compute-1 ceph-mon[80107]: 4.16 scrub starts
Jan 26 09:42:03 compute-1 ceph-mon[80107]: 4.16 scrub ok
Jan 26 09:42:03 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/308369494' entity='client.admin' 
Jan 26 09:42:04 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.12 scrub starts
Jan 26 09:42:04 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.12 scrub ok
Jan 26 09:42:04 compute-1 ceph-mon[80107]: 2.17 scrub starts
Jan 26 09:42:04 compute-1 ceph-mon[80107]: 2.17 scrub ok
Jan 26 09:42:04 compute-1 ceph-mon[80107]: 3.15 scrub starts
Jan 26 09:42:04 compute-1 ceph-mon[80107]: 3.15 scrub ok
Jan 26 09:42:04 compute-1 ceph-mon[80107]: 6.14 scrub starts
Jan 26 09:42:04 compute-1 ceph-mon[80107]: 6.14 scrub ok
Jan 26 09:42:04 compute-1 ceph-mon[80107]: pgmap v137: 193 pgs: 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:42:04 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1533391079' entity='client.admin' 
Jan 26 09:42:05 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.16 scrub starts
Jan 26 09:42:05 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.16 scrub ok
Jan 26 09:42:05 compute-1 ceph-mon[80107]: 7.12 scrub starts
Jan 26 09:42:05 compute-1 ceph-mon[80107]: 7.12 scrub ok
Jan 26 09:42:05 compute-1 ceph-mon[80107]: 5.13 scrub starts
Jan 26 09:42:05 compute-1 ceph-mon[80107]: 5.13 scrub ok
Jan 26 09:42:05 compute-1 ceph-mon[80107]: 6.16 scrub starts
Jan 26 09:42:05 compute-1 ceph-mon[80107]: 6.16 scrub ok
Jan 26 09:42:05 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e45 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:42:06 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.14 scrub starts
Jan 26 09:42:06 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.14 scrub ok
Jan 26 09:42:07 compute-1 ceph-mon[80107]: 2.16 scrub starts
Jan 26 09:42:07 compute-1 ceph-mon[80107]: 2.16 scrub ok
Jan 26 09:42:07 compute-1 ceph-mon[80107]: 5.12 scrub starts
Jan 26 09:42:07 compute-1 ceph-mon[80107]: 5.12 scrub ok
Jan 26 09:42:07 compute-1 ceph-mon[80107]: 4.12 scrub starts
Jan 26 09:42:07 compute-1 ceph-mon[80107]: 4.12 scrub ok
Jan 26 09:42:07 compute-1 ceph-mon[80107]: pgmap v138: 193 pgs: 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:42:07 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1831666463' entity='client.admin' 
Jan 26 09:42:07 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.17 scrub starts
Jan 26 09:42:07 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.17 scrub ok
Jan 26 09:42:08 compute-1 ceph-mon[80107]: 2.14 scrub starts
Jan 26 09:42:08 compute-1 ceph-mon[80107]: 2.14 scrub ok
Jan 26 09:42:08 compute-1 ceph-mon[80107]: 3.11 deep-scrub starts
Jan 26 09:42:08 compute-1 ceph-mon[80107]: 3.11 deep-scrub ok
Jan 26 09:42:08 compute-1 ceph-mon[80107]: 6.10 scrub starts
Jan 26 09:42:08 compute-1 ceph-mon[80107]: 6.10 scrub ok
Jan 26 09:42:08 compute-1 ceph-mon[80107]: 7.17 scrub starts
Jan 26 09:42:08 compute-1 ceph-mon[80107]: 7.17 scrub ok
Jan 26 09:42:08 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:08 compute-1 ceph-mon[80107]: pgmap v139: 193 pgs: 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:42:08 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.11 scrub starts
Jan 26 09:42:08 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.11 scrub ok
Jan 26 09:42:09 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.15 scrub starts
Jan 26 09:42:09 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.15 scrub ok
Jan 26 09:42:09 compute-1 ceph-mon[80107]: 3.e scrub starts
Jan 26 09:42:09 compute-1 ceph-mon[80107]: 3.e scrub ok
Jan 26 09:42:09 compute-1 ceph-mon[80107]: 4.11 deep-scrub starts
Jan 26 09:42:09 compute-1 ceph-mon[80107]: 4.11 deep-scrub ok
Jan 26 09:42:09 compute-1 ceph-mon[80107]: 2.11 scrub starts
Jan 26 09:42:09 compute-1 ceph-mon[80107]: 2.11 scrub ok
Jan 26 09:42:09 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1606551457' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Jan 26 09:42:09 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:09 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:09 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.fgzdbm", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 26 09:42:09 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.fgzdbm", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 26 09:42:09 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:09 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:42:09 compute-1 ceph-mon[80107]: Deploying daemon rgw.rgw.compute-2.fgzdbm on compute-2
Jan 26 09:42:10 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.3 scrub starts
Jan 26 09:42:10 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.3 scrub ok
Jan 26 09:42:10 compute-1 ceph-mon[80107]: 5.8 scrub starts
Jan 26 09:42:10 compute-1 ceph-mon[80107]: 5.8 scrub ok
Jan 26 09:42:10 compute-1 ceph-mon[80107]: 6.11 scrub starts
Jan 26 09:42:10 compute-1 ceph-mon[80107]: 6.11 scrub ok
Jan 26 09:42:10 compute-1 ceph-mon[80107]: 7.15 scrub starts
Jan 26 09:42:10 compute-1 ceph-mon[80107]: 7.15 scrub ok
Jan 26 09:42:10 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1606551457' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Jan 26 09:42:10 compute-1 ceph-mon[80107]: mgrmap e12: compute-0.zllcia(active, since 3m), standbys: compute-2.oynaeu, compute-1.xammti
Jan 26 09:42:10 compute-1 ceph-mon[80107]: 5.b scrub starts
Jan 26 09:42:10 compute-1 ceph-mon[80107]: 5.b scrub ok
Jan 26 09:42:10 compute-1 ceph-mon[80107]: pgmap v140: 193 pgs: 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:42:10 compute-1 sudo[81778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:42:10 compute-1 sudo[81778]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:10 compute-1 sudo[81778]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:10 compute-1 sudo[81803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:42:10 compute-1 sudo[81803]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:10 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e45 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:42:10 compute-1 podman[81870]: 2026-01-26 09:42:10.794671323 +0000 UTC m=+0.037591151 container create 77b1d608585c1a0bb7217e12a78800b2a94c3442ae259ee727757c8d7d00aeb0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_yonath, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 09:42:10 compute-1 systemd[1]: Started libpod-conmon-77b1d608585c1a0bb7217e12a78800b2a94c3442ae259ee727757c8d7d00aeb0.scope.
Jan 26 09:42:10 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:42:10 compute-1 podman[81870]: 2026-01-26 09:42:10.860548228 +0000 UTC m=+0.103468076 container init 77b1d608585c1a0bb7217e12a78800b2a94c3442ae259ee727757c8d7d00aeb0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_yonath, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 09:42:10 compute-1 podman[81870]: 2026-01-26 09:42:10.865910997 +0000 UTC m=+0.108830825 container start 77b1d608585c1a0bb7217e12a78800b2a94c3442ae259ee727757c8d7d00aeb0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_yonath, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 09:42:10 compute-1 podman[81870]: 2026-01-26 09:42:10.868628442 +0000 UTC m=+0.111548290 container attach 77b1d608585c1a0bb7217e12a78800b2a94c3442ae259ee727757c8d7d00aeb0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_yonath, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Jan 26 09:42:10 compute-1 cranky_yonath[81886]: 167 167
Jan 26 09:42:10 compute-1 systemd[1]: libpod-77b1d608585c1a0bb7217e12a78800b2a94c3442ae259ee727757c8d7d00aeb0.scope: Deactivated successfully.
Jan 26 09:42:10 compute-1 podman[81870]: 2026-01-26 09:42:10.87039615 +0000 UTC m=+0.113315978 container died 77b1d608585c1a0bb7217e12a78800b2a94c3442ae259ee727757c8d7d00aeb0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_yonath, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_REF=squid, ceph=True)
Jan 26 09:42:10 compute-1 podman[81870]: 2026-01-26 09:42:10.778722082 +0000 UTC m=+0.021641930 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:42:10 compute-1 systemd[1]: var-lib-containers-storage-overlay-91892cd777ceefaf9d1f2af3a86edfb96691207ba4586a4cd92f4d1bb58e0f86-merged.mount: Deactivated successfully.
Jan 26 09:42:10 compute-1 podman[81870]: 2026-01-26 09:42:10.899717823 +0000 UTC m=+0.142637651 container remove 77b1d608585c1a0bb7217e12a78800b2a94c3442ae259ee727757c8d7d00aeb0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_yonath, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Jan 26 09:42:10 compute-1 systemd[1]: libpod-conmon-77b1d608585c1a0bb7217e12a78800b2a94c3442ae259ee727757c8d7d00aeb0.scope: Deactivated successfully.
Jan 26 09:42:10 compute-1 systemd[1]: Reloading.
Jan 26 09:42:11 compute-1 systemd-rc-local-generator[81924]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:42:11 compute-1 systemd-sysv-generator[81931]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:42:11 compute-1 ceph-mon[80107]: 6.13 scrub starts
Jan 26 09:42:11 compute-1 ceph-mon[80107]: 6.13 scrub ok
Jan 26 09:42:11 compute-1 ceph-mon[80107]: 2.3 scrub starts
Jan 26 09:42:11 compute-1 ceph-mon[80107]: 2.3 scrub ok
Jan 26 09:42:11 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/588199508' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Jan 26 09:42:11 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:11 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:11 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:11 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.fbcidm", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 26 09:42:11 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.fbcidm", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 26 09:42:11 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:11 compute-1 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:42:11 compute-1 ceph-mon[80107]: Deploying daemon rgw.rgw.compute-1.fbcidm on compute-1
Jan 26 09:42:11 compute-1 ceph-mon[80107]: 3.0 scrub starts
Jan 26 09:42:11 compute-1 ceph-mon[80107]: 3.0 scrub ok
Jan 26 09:42:11 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.0 scrub starts
Jan 26 09:42:11 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.0 scrub ok
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: mgr respawn  1: '-n'
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: mgr respawn  2: 'mgr.compute-1.xammti'
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: mgr respawn  3: '-f'
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: mgr respawn  4: '--setuser'
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: mgr respawn  5: 'ceph'
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: mgr respawn  6: '--setgroup'
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: mgr respawn  7: 'ceph'
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: mgr respawn  8: '--default-log-to-file=false'
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: mgr respawn  9: '--default-log-to-journald=true'
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: mgr respawn  exe_path /proc/self/exe
Jan 26 09:42:11 compute-1 systemd[1]: Reloading.
Jan 26 09:42:11 compute-1 sshd-session[73020]: Connection closed by 192.168.122.100 port 33886
Jan 26 09:42:11 compute-1 sshd-session[72877]: Connection closed by 192.168.122.100 port 33854
Jan 26 09:42:11 compute-1 sshd-session[72848]: Connection closed by 192.168.122.100 port 33842
Jan 26 09:42:11 compute-1 sshd-session[72991]: Connection closed by 192.168.122.100 port 33880
Jan 26 09:42:11 compute-1 sshd-session[72906]: Connection closed by 192.168.122.100 port 33862
Jan 26 09:42:11 compute-1 sshd-session[72732]: Connection closed by 192.168.122.100 port 33820
Jan 26 09:42:11 compute-1 sshd-session[72964]: Connection closed by 192.168.122.100 port 33868
Jan 26 09:42:11 compute-1 sshd-session[72819]: Connection closed by 192.168.122.100 port 33836
Jan 26 09:42:11 compute-1 sshd-session[72761]: Connection closed by 192.168.122.100 port 33822
Jan 26 09:42:11 compute-1 sshd-session[72935]: Connection closed by 192.168.122.100 port 33866
Jan 26 09:42:11 compute-1 sshd-session[72790]: Connection closed by 192.168.122.100 port 33826
Jan 26 09:42:11 compute-1 sshd-session[72731]: Connection closed by 192.168.122.100 port 33812
Jan 26 09:42:11 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e46 e46: 3 total, 3 up, 3 in
Jan 26 09:42:11 compute-1 systemd-rc-local-generator[81968]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:42:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: ignoring --setuser ceph since I am not root
Jan 26 09:42:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: ignoring --setgroup ceph since I am not root
Jan 26 09:42:11 compute-1 systemd-sysv-generator[81972]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: pidfile_write: ignore empty --pid-file
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'alerts'
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'balancer'
Jan 26 09:42:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:11.436+0000 7f4bb702e140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 09:42:11 compute-1 sshd-session[72874]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 26 09:42:11 compute-1 sshd-session[72758]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 26 09:42:11 compute-1 sshd-session[73017]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 26 09:42:11 compute-1 sshd-session[72816]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 26 09:42:11 compute-1 sshd-session[72726]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 26 09:42:11 compute-1 systemd[1]: session-27.scope: Deactivated successfully.
Jan 26 09:42:11 compute-1 sshd-session[72988]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 26 09:42:11 compute-1 systemd[1]: session-23.scope: Deactivated successfully.
Jan 26 09:42:11 compute-1 sshd-session[72903]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 26 09:42:11 compute-1 systemd[1]: session-25.scope: Deactivated successfully.
Jan 26 09:42:11 compute-1 sshd-session[72932]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 26 09:42:11 compute-1 systemd-logind[783]: Removed session 25.
Jan 26 09:42:11 compute-1 sshd-session[72709]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 26 09:42:11 compute-1 systemd[1]: session-22.scope: Deactivated successfully.
Jan 26 09:42:11 compute-1 sshd-session[72845]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 26 09:42:11 compute-1 systemd[1]: session-20.scope: Deactivated successfully.
Jan 26 09:42:11 compute-1 sshd-session[72961]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 26 09:42:11 compute-1 systemd-logind[783]: Removed session 22.
Jan 26 09:42:11 compute-1 sshd-session[72787]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 26 09:42:11 compute-1 systemd[1]: session-31.scope: Deactivated successfully.
Jan 26 09:42:11 compute-1 systemd[1]: session-29.scope: Deactivated successfully.
Jan 26 09:42:11 compute-1 systemd[1]: session-30.scope: Deactivated successfully.
Jan 26 09:42:11 compute-1 systemd-logind[783]: Session 27 logged out. Waiting for processes to exit.
Jan 26 09:42:11 compute-1 systemd-logind[783]: Session 32 logged out. Waiting for processes to exit.
Jan 26 09:42:11 compute-1 systemd-logind[783]: Session 20 logged out. Waiting for processes to exit.
Jan 26 09:42:11 compute-1 systemd[1]: session-28.scope: Deactivated successfully.
Jan 26 09:42:11 compute-1 systemd-logind[783]: Session 29 logged out. Waiting for processes to exit.
Jan 26 09:42:11 compute-1 systemd[1]: session-24.scope: Deactivated successfully.
Jan 26 09:42:11 compute-1 systemd[1]: session-26.scope: Deactivated successfully.
Jan 26 09:42:11 compute-1 systemd-logind[783]: Session 31 logged out. Waiting for processes to exit.
Jan 26 09:42:11 compute-1 systemd[1]: Starting Ceph rgw.rgw.compute-1.fbcidm for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 09:42:11 compute-1 systemd-logind[783]: Session 23 logged out. Waiting for processes to exit.
Jan 26 09:42:11 compute-1 systemd-logind[783]: Session 30 logged out. Waiting for processes to exit.
Jan 26 09:42:11 compute-1 systemd-logind[783]: Session 28 logged out. Waiting for processes to exit.
Jan 26 09:42:11 compute-1 systemd-logind[783]: Session 24 logged out. Waiting for processes to exit.
Jan 26 09:42:11 compute-1 systemd-logind[783]: Session 26 logged out. Waiting for processes to exit.
Jan 26 09:42:11 compute-1 systemd-logind[783]: Removed session 27.
Jan 26 09:42:11 compute-1 systemd-logind[783]: Removed session 23.
Jan 26 09:42:11 compute-1 systemd-logind[783]: Removed session 20.
Jan 26 09:42:11 compute-1 systemd-logind[783]: Removed session 31.
Jan 26 09:42:11 compute-1 systemd-logind[783]: Removed session 29.
Jan 26 09:42:11 compute-1 systemd-logind[783]: Removed session 30.
Jan 26 09:42:11 compute-1 systemd-logind[783]: Removed session 28.
Jan 26 09:42:11 compute-1 systemd-logind[783]: Removed session 24.
Jan 26 09:42:11 compute-1 systemd-logind[783]: Removed session 26.
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 09:42:11 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'cephadm'
Jan 26 09:42:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:11.552+0000 7f4bb702e140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 09:42:11 compute-1 podman[82045]: 2026-01-26 09:42:11.742941053 +0000 UTC m=+0.038283771 container create 4dd5c8bd1cc856abf0889650d4a4346dea8cfb7a23592a8467268a3371cecad4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-rgw-rgw-compute-1-fbcidm, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 09:42:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef533ef4332b913a823ed51d28459eea8cb3456240c3cd6ba607a173abcf5e0a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 09:42:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef533ef4332b913a823ed51d28459eea8cb3456240c3cd6ba607a173abcf5e0a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:42:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef533ef4332b913a823ed51d28459eea8cb3456240c3cd6ba607a173abcf5e0a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 09:42:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef533ef4332b913a823ed51d28459eea8cb3456240c3cd6ba607a173abcf5e0a/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-1.fbcidm supports timestamps until 2038 (0x7fffffff)
Jan 26 09:42:11 compute-1 podman[82045]: 2026-01-26 09:42:11.809327522 +0000 UTC m=+0.104670260 container init 4dd5c8bd1cc856abf0889650d4a4346dea8cfb7a23592a8467268a3371cecad4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-rgw-rgw-compute-1-fbcidm, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 09:42:11 compute-1 podman[82045]: 2026-01-26 09:42:11.814047322 +0000 UTC m=+0.109390040 container start 4dd5c8bd1cc856abf0889650d4a4346dea8cfb7a23592a8467268a3371cecad4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-rgw-rgw-compute-1-fbcidm, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 09:42:11 compute-1 bash[82045]: 4dd5c8bd1cc856abf0889650d4a4346dea8cfb7a23592a8467268a3371cecad4
Jan 26 09:42:11 compute-1 podman[82045]: 2026-01-26 09:42:11.726645972 +0000 UTC m=+0.021988690 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:42:11 compute-1 systemd[1]: Started Ceph rgw.rgw.compute-1.fbcidm for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:42:11 compute-1 radosgw[82065]: deferred set uid:gid to 167:167 (ceph:ceph)
Jan 26 09:42:11 compute-1 radosgw[82065]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process radosgw, pid 2
Jan 26 09:42:11 compute-1 radosgw[82065]: framework: beast
Jan 26 09:42:11 compute-1 radosgw[82065]: framework conf key: endpoint, val: 192.168.122.101:8082
Jan 26 09:42:11 compute-1 radosgw[82065]: init_numa not setting numa affinity
Jan 26 09:42:11 compute-1 sudo[81803]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:11 compute-1 systemd[1]: session-32.scope: Deactivated successfully.
Jan 26 09:42:11 compute-1 systemd[1]: session-32.scope: Consumed 1min 37.906s CPU time.
Jan 26 09:42:11 compute-1 systemd-logind[783]: Removed session 32.
Jan 26 09:42:12 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.2 deep-scrub starts
Jan 26 09:42:12 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.2 deep-scrub ok
Jan 26 09:42:12 compute-1 ceph-mon[80107]: 4.10 scrub starts
Jan 26 09:42:12 compute-1 ceph-mon[80107]: 4.10 scrub ok
Jan 26 09:42:12 compute-1 ceph-mon[80107]: 7.0 scrub starts
Jan 26 09:42:12 compute-1 ceph-mon[80107]: 7.0 scrub ok
Jan 26 09:42:12 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/588199508' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Jan 26 09:42:12 compute-1 ceph-mon[80107]: mgrmap e13: compute-0.zllcia(active, since 3m), standbys: compute-2.oynaeu, compute-1.xammti
Jan 26 09:42:12 compute-1 ceph-mon[80107]: osdmap e46: 3 total, 3 up, 3 in
Jan 26 09:42:12 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2891557756' entity='client.rgw.rgw.compute-2.fgzdbm' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 26 09:42:12 compute-1 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-2.fgzdbm' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 26 09:42:12 compute-1 ceph-mon[80107]: 5.0 scrub starts
Jan 26 09:42:12 compute-1 ceph-mon[80107]: 5.0 scrub ok
Jan 26 09:42:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e47 e47: 3 total, 3 up, 3 in
Jan 26 09:42:12 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'crash'
Jan 26 09:42:12 compute-1 ceph-mgr[80416]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 09:42:12 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'dashboard'
Jan 26 09:42:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:12.457+0000 7f4bb702e140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 09:42:13 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'devicehealth'
Jan 26 09:42:13 compute-1 ceph-mgr[80416]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 09:42:13 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'diskprediction_local'
Jan 26 09:42:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:13.138+0000 7f4bb702e140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 09:42:13 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.7 scrub starts
Jan 26 09:42:13 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.7 scrub ok
Jan 26 09:42:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 26 09:42:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 26 09:42:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]:   from numpy import show_config as show_numpy_config
Jan 26 09:42:13 compute-1 ceph-mgr[80416]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 09:42:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:13.329+0000 7f4bb702e140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 09:42:13 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'influx'
Jan 26 09:42:13 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e48 e48: 3 total, 3 up, 3 in
Jan 26 09:42:13 compute-1 ceph-mon[80107]: 4.f scrub starts
Jan 26 09:42:13 compute-1 ceph-mon[80107]: 4.f scrub ok
Jan 26 09:42:13 compute-1 ceph-mon[80107]: 2.2 deep-scrub starts
Jan 26 09:42:13 compute-1 ceph-mon[80107]: 2.2 deep-scrub ok
Jan 26 09:42:13 compute-1 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-2.fgzdbm' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Jan 26 09:42:13 compute-1 ceph-mon[80107]: osdmap e47: 3 total, 3 up, 3 in
Jan 26 09:42:13 compute-1 ceph-mon[80107]: 5.4 scrub starts
Jan 26 09:42:13 compute-1 ceph-mon[80107]: 5.4 scrub ok
Jan 26 09:42:13 compute-1 ceph-mgr[80416]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 09:42:13 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'insights'
Jan 26 09:42:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:13.408+0000 7f4bb702e140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 09:42:13 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'iostat'
Jan 26 09:42:13 compute-1 ceph-mgr[80416]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 09:42:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:13.553+0000 7f4bb702e140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 09:42:13 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'k8sevents'
Jan 26 09:42:13 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'localpool'
Jan 26 09:42:14 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'mds_autoscaler'
Jan 26 09:42:14 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.1 scrub starts
Jan 26 09:42:14 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.1 scrub ok
Jan 26 09:42:14 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'mirroring'
Jan 26 09:42:14 compute-1 ceph-mon[80107]: 6.c scrub starts
Jan 26 09:42:14 compute-1 ceph-mon[80107]: 6.c scrub ok
Jan 26 09:42:14 compute-1 ceph-mon[80107]: 7.7 scrub starts
Jan 26 09:42:14 compute-1 ceph-mon[80107]: 7.7 scrub ok
Jan 26 09:42:14 compute-1 ceph-mon[80107]: osdmap e48: 3 total, 3 up, 3 in
Jan 26 09:42:14 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1812478715' entity='client.rgw.rgw.compute-2.fgzdbm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 26 09:42:14 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/992292627' entity='client.rgw.rgw.compute-1.fbcidm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 26 09:42:14 compute-1 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-2.fgzdbm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 26 09:42:14 compute-1 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-1.fbcidm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 26 09:42:14 compute-1 ceph-mon[80107]: 5.e scrub starts
Jan 26 09:42:14 compute-1 ceph-mon[80107]: 5.e scrub ok
Jan 26 09:42:14 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e49 e49: 3 total, 3 up, 3 in
Jan 26 09:42:14 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'nfs'
Jan 26 09:42:14 compute-1 ceph-mgr[80416]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 09:42:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:14.724+0000 7f4bb702e140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 09:42:14 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'orchestrator'
Jan 26 09:42:14 compute-1 ceph-mgr[80416]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 09:42:14 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'osd_perf_query'
Jan 26 09:42:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:14.961+0000 7f4bb702e140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 09:42:15 compute-1 ceph-mgr[80416]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 09:42:15 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'osd_support'
Jan 26 09:42:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:15.050+0000 7f4bb702e140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 09:42:15 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.d scrub starts
Jan 26 09:42:15 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.d scrub ok
Jan 26 09:42:15 compute-1 ceph-mgr[80416]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 09:42:15 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'pg_autoscaler'
Jan 26 09:42:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:15.122+0000 7f4bb702e140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 09:42:15 compute-1 ceph-mgr[80416]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 09:42:15 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'progress'
Jan 26 09:42:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:15.205+0000 7f4bb702e140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 09:42:15 compute-1 ceph-mgr[80416]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 09:42:15 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'prometheus'
Jan 26 09:42:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:15.276+0000 7f4bb702e140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 09:42:15 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e50 e50: 3 total, 3 up, 3 in
Jan 26 09:42:15 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 50 pg[10.0( empty local-lis/les=0/0 n=0 ec=50/50 lis/c=0/0 les/c/f=0/0/0 sis=50) [1] r=0 lpr=50 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:15 compute-1 ceph-mon[80107]: 6.f scrub starts
Jan 26 09:42:15 compute-1 ceph-mon[80107]: 6.f scrub ok
Jan 26 09:42:15 compute-1 ceph-mon[80107]: 7.1 scrub starts
Jan 26 09:42:15 compute-1 ceph-mon[80107]: 7.1 scrub ok
Jan 26 09:42:15 compute-1 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-2.fgzdbm' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 26 09:42:15 compute-1 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-1.fbcidm' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 26 09:42:15 compute-1 ceph-mon[80107]: osdmap e49: 3 total, 3 up, 3 in
Jan 26 09:42:15 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e50 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:42:15 compute-1 ceph-mgr[80416]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 09:42:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:15.655+0000 7f4bb702e140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 09:42:15 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'rbd_support'
Jan 26 09:42:15 compute-1 ceph-mgr[80416]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 09:42:15 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'restful'
Jan 26 09:42:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:15.758+0000 7f4bb702e140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 09:42:15 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'rgw'
Jan 26 09:42:16 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.c scrub starts
Jan 26 09:42:16 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.c scrub ok
Jan 26 09:42:16 compute-1 ceph-mgr[80416]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 09:42:16 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'rook'
Jan 26 09:42:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:16.253+0000 7f4bb702e140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 09:42:16 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e51 e51: 3 total, 3 up, 3 in
Jan 26 09:42:16 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 51 pg[10.0( empty local-lis/les=50/51 n=0 ec=50/50 lis/c=0/0 les/c/f=0/0/0 sis=50) [1] r=0 lpr=50 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:16 compute-1 ceph-mon[80107]: 5.d scrub starts
Jan 26 09:42:16 compute-1 ceph-mon[80107]: 5.d scrub ok
Jan 26 09:42:16 compute-1 ceph-mon[80107]: 4.0 scrub starts
Jan 26 09:42:16 compute-1 ceph-mon[80107]: 4.0 scrub ok
Jan 26 09:42:16 compute-1 ceph-mon[80107]: 7.d scrub starts
Jan 26 09:42:16 compute-1 ceph-mon[80107]: 7.d scrub ok
Jan 26 09:42:16 compute-1 ceph-mon[80107]: 5.1a scrub starts
Jan 26 09:42:16 compute-1 ceph-mon[80107]: 5.1a scrub ok
Jan 26 09:42:16 compute-1 ceph-mon[80107]: osdmap e50: 3 total, 3 up, 3 in
Jan 26 09:42:16 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/992292627' entity='client.rgw.rgw.compute-1.fbcidm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 26 09:42:16 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1812478715' entity='client.rgw.rgw.compute-2.fgzdbm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 26 09:42:16 compute-1 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-1.fbcidm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 26 09:42:16 compute-1 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-2.fgzdbm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 26 09:42:16 compute-1 ceph-mon[80107]: 6.0 scrub starts
Jan 26 09:42:16 compute-1 ceph-mon[80107]: 6.0 scrub ok
Jan 26 09:42:16 compute-1 ceph-mon[80107]: 6.1e scrub starts
Jan 26 09:42:16 compute-1 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-1.fbcidm' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 26 09:42:16 compute-1 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-2.fgzdbm' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 26 09:42:16 compute-1 ceph-mon[80107]: osdmap e51: 3 total, 3 up, 3 in
Jan 26 09:42:16 compute-1 ceph-mgr[80416]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 09:42:16 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'selftest'
Jan 26 09:42:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:16.890+0000 7f4bb702e140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 09:42:16 compute-1 ceph-mgr[80416]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 09:42:16 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'snap_schedule'
Jan 26 09:42:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:16.968+0000 7f4bb702e140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 09:42:17 compute-1 ceph-mgr[80416]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 09:42:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:17.071+0000 7f4bb702e140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 09:42:17 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'stats'
Jan 26 09:42:17 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.19 scrub starts
Jan 26 09:42:17 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.19 scrub ok
Jan 26 09:42:17 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'status'
Jan 26 09:42:17 compute-1 ceph-mgr[80416]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 09:42:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:17.253+0000 7f4bb702e140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 09:42:17 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'telegraf'
Jan 26 09:42:17 compute-1 ceph-mgr[80416]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 09:42:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:17.335+0000 7f4bb702e140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 09:42:17 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'telemetry'
Jan 26 09:42:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e52 e52: 3 total, 3 up, 3 in
Jan 26 09:42:17 compute-1 ceph-mon[80107]: 7.c scrub starts
Jan 26 09:42:17 compute-1 ceph-mon[80107]: 7.c scrub ok
Jan 26 09:42:17 compute-1 ceph-mon[80107]: 6.1e scrub ok
Jan 26 09:42:17 compute-1 ceph-mon[80107]: 4.4 deep-scrub starts
Jan 26 09:42:17 compute-1 ceph-mon[80107]: 4.4 deep-scrub ok
Jan 26 09:42:17 compute-1 ceph-mon[80107]: osdmap e52: 3 total, 3 up, 3 in
Jan 26 09:42:17 compute-1 ceph-mgr[80416]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 09:42:17 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'test_orchestrator'
Jan 26 09:42:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:17.527+0000 7f4bb702e140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 09:42:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e53 e53: 3 total, 3 up, 3 in
Jan 26 09:42:17 compute-1 ceph-mgr[80416]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 09:42:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:17.776+0000 7f4bb702e140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 09:42:17 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'volumes'
Jan 26 09:42:18 compute-1 ceph-mgr[80416]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 09:42:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:18.065+0000 7f4bb702e140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 09:42:18 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'zabbix'
Jan 26 09:42:18 compute-1 sshd-session[82664]: Accepted publickey for ceph-admin from 192.168.122.100 port 42160 ssh2: RSA SHA256:cGz1g5qmzBfeiAiDRElnaAonZh1cdMIZMAXyGkEzbws
Jan 26 09:42:18 compute-1 systemd-logind[783]: New session 33 of user ceph-admin.
Jan 26 09:42:18 compute-1 systemd[1]: Started Session 33 of User ceph-admin.
Jan 26 09:42:18 compute-1 sshd-session[82664]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 26 09:42:18 compute-1 ceph-mgr[80416]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 09:42:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:18.144+0000 7f4bb702e140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 09:42:18 compute-1 ceph-mgr[80416]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 26 09:42:18 compute-1 ceph-mgr[80416]: mgr load Constructed class from module: dashboard
Jan 26 09:42:18 compute-1 ceph-mgr[80416]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Jan 26 09:42:18 compute-1 ceph-mgr[80416]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 26 09:42:18 compute-1 ceph-mgr[80416]: [dashboard INFO root] Starting engine...
Jan 26 09:42:18 compute-1 ceph-mgr[80416]: ms_deliver_dispatch: unhandled message 0x559dd25af860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Jan 26 09:42:18 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.1a scrub starts
Jan 26 09:42:18 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.1a scrub ok
Jan 26 09:42:18 compute-1 sudo[82678]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:42:18 compute-1 sudo[82678]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:18 compute-1 sudo[82678]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:18 compute-1 ceph-mgr[80416]: [dashboard INFO root] Engine started...
Jan 26 09:42:18 compute-1 sudo[82705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 26 09:42:18 compute-1 sudo[82705]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:18 compute-1 ceph-mon[80107]: 7.19 scrub starts
Jan 26 09:42:18 compute-1 ceph-mon[80107]: 7.19 scrub ok
Jan 26 09:42:18 compute-1 ceph-mon[80107]: 4.1c scrub starts
Jan 26 09:42:18 compute-1 ceph-mon[80107]: 4.1c scrub ok
Jan 26 09:42:18 compute-1 ceph-mon[80107]: Standby manager daemon compute-2.oynaeu restarted
Jan 26 09:42:18 compute-1 ceph-mon[80107]: Standby manager daemon compute-2.oynaeu started
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/992292627' entity='client.rgw.rgw.compute-1.fbcidm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1812478715' entity='client.rgw.rgw.compute-2.fgzdbm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-1.fbcidm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-2.fgzdbm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: Active manager daemon compute-0.zllcia restarted
Jan 26 09:42:18 compute-1 ceph-mon[80107]: Activating manager daemon compute-0.zllcia
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-1.fbcidm' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-2.fgzdbm' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 26 09:42:18 compute-1 ceph-mon[80107]: osdmap e53: 3 total, 3 up, 3 in
Jan 26 09:42:18 compute-1 ceph-mon[80107]: mgrmap e14: compute-0.zllcia(active, starting, since 0.0391316s), standbys: compute-1.xammti, compute-2.oynaeu
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mgr metadata", "who": "compute-0.zllcia", "id": "compute-0.zllcia"}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1812478715' entity='client.rgw.rgw.compute-2.fgzdbm' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mgr metadata", "who": "compute-1.xammti", "id": "compute-1.xammti"}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mgr metadata", "who": "compute-2.oynaeu", "id": "compute-2.oynaeu"}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-2.fgzdbm' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mds metadata"}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata"}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/992292627' entity='client.rgw.rgw.compute-1.fbcidm' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-1.fbcidm' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: Manager daemon compute-0.zllcia is now available
Jan 26 09:42:18 compute-1 ceph-mon[80107]: 6.6 scrub starts
Jan 26 09:42:18 compute-1 ceph-mon[80107]: 6.6 scrub ok
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.zllcia/mirror_snapshot_schedule"}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.zllcia/trash_purge_schedule"}]: dispatch
Jan 26 09:42:18 compute-1 ceph-mon[80107]: Standby manager daemon compute-1.xammti restarted
Jan 26 09:42:18 compute-1 ceph-mon[80107]: Standby manager daemon compute-1.xammti started
Jan 26 09:42:18 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e54 e54: 3 total, 3 up, 3 in
Jan 26 09:42:18 compute-1 radosgw[82065]: v1 topic migration: starting v1 topic migration..
Jan 26 09:42:18 compute-1 radosgw[82065]: LDAP not started since no server URIs were provided in the configuration.
Jan 26 09:42:18 compute-1 radosgw[82065]: v1 topic migration: finished v1 topic migration
Jan 26 09:42:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-rgw-rgw-compute-1-fbcidm[82061]: 2026-01-26T09:42:18.781+0000 7f18dcb5b980 -1 LDAP not started since no server URIs were provided in the configuration.
Jan 26 09:42:18 compute-1 radosgw[82065]: framework: beast
Jan 26 09:42:18 compute-1 radosgw[82065]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Jan 26 09:42:18 compute-1 radosgw[82065]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Jan 26 09:42:18 compute-1 radosgw[82065]: starting handler: beast
Jan 26 09:42:18 compute-1 podman[82802]: 2026-01-26 09:42:18.822111988 +0000 UTC m=+0.067907681 container exec 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid)
Jan 26 09:42:18 compute-1 radosgw[82065]: set uid:gid to 167:167 (ceph:ceph)
Jan 26 09:42:18 compute-1 radosgw[82065]: mgrc service_daemon_register rgw.24169 metadata {arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.101:8082,frontend_type#0=beast,hostname=compute-1,id=rgw.compute-1.fbcidm,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864304,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=88adcf12-6dc3-48b6-86bb-ed23fd934e78,zone_name=default,zonegroup_id=423841e2-30ae-45d1-92b7-7a24aa3d4488,zonegroup_name=default}
Jan 26 09:42:18 compute-1 podman[82802]: 2026-01-26 09:42:18.917675954 +0000 UTC m=+0.163471667 container exec_died 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 09:42:19 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.1b scrub starts
Jan 26 09:42:19 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.1b scrub ok
Jan 26 09:42:19 compute-1 sudo[82705]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:19 compute-1 sudo[82939]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:42:19 compute-1 sudo[82939]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:19 compute-1 sudo[82939]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:19 compute-1 sudo[82964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 09:42:19 compute-1 sudo[82964]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:19 compute-1 ceph-mon[80107]: 7.1a scrub starts
Jan 26 09:42:19 compute-1 ceph-mon[80107]: 7.1a scrub ok
Jan 26 09:42:19 compute-1 ceph-mon[80107]: 7.1f scrub starts
Jan 26 09:42:19 compute-1 ceph-mon[80107]: 7.1f scrub ok
Jan 26 09:42:19 compute-1 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-2.fgzdbm' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 26 09:42:19 compute-1 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-1.fbcidm' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 26 09:42:19 compute-1 ceph-mon[80107]: osdmap e54: 3 total, 3 up, 3 in
Jan 26 09:42:19 compute-1 ceph-mon[80107]: 6.b scrub starts
Jan 26 09:42:19 compute-1 ceph-mon[80107]: 6.b scrub ok
Jan 26 09:42:19 compute-1 ceph-mon[80107]: mgrmap e15: compute-0.zllcia(active, since 1.0904s), standbys: compute-1.xammti, compute-2.oynaeu
Jan 26 09:42:19 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:19 compute-1 ceph-mon[80107]: [26/Jan/2026:09:42:19] ENGINE Bus STARTING
Jan 26 09:42:19 compute-1 ceph-mon[80107]: [26/Jan/2026:09:42:19] ENGINE Serving on https://192.168.122.100:7150
Jan 26 09:42:19 compute-1 ceph-mon[80107]: [26/Jan/2026:09:42:19] ENGINE Client ('192.168.122.100', 53286) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 26 09:42:19 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:19 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:19 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:19 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:19 compute-1 ceph-mon[80107]: [26/Jan/2026:09:42:19] ENGINE Serving on http://192.168.122.100:8765
Jan 26 09:42:19 compute-1 ceph-mon[80107]: [26/Jan/2026:09:42:19] ENGINE Bus STARTED
Jan 26 09:42:19 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:19 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:19 compute-1 sudo[82964]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:20 compute-1 sudo[83020]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:42:20 compute-1 sudo[83020]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:20 compute-1 sudo[83020]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:20 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.7 scrub starts
Jan 26 09:42:20 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.7 scrub ok
Jan 26 09:42:20 compute-1 sudo[83045]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Jan 26 09:42:20 compute-1 sudo[83045]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:20 compute-1 sudo[83045]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:20 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e54 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:42:20 compute-1 sudo[83089]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 26 09:42:20 compute-1 sudo[83089]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:20 compute-1 sudo[83089]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:20 compute-1 ceph-mon[80107]: 4.1b scrub starts
Jan 26 09:42:20 compute-1 ceph-mon[80107]: 4.1b scrub ok
Jan 26 09:42:20 compute-1 ceph-mon[80107]: 2.18 scrub starts
Jan 26 09:42:20 compute-1 ceph-mon[80107]: 2.18 scrub ok
Jan 26 09:42:20 compute-1 ceph-mon[80107]: pgmap v5: 197 pgs: 197 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:42:20 compute-1 ceph-mon[80107]: from='client.14397 -' entity='client.admin' cmd=[{"prefix": "dashboard set-grafana-api-password", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 09:42:20 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:20 compute-1 ceph-mon[80107]: 6.18 scrub starts
Jan 26 09:42:20 compute-1 ceph-mon[80107]: 6.18 scrub ok
Jan 26 09:42:20 compute-1 ceph-mon[80107]: mgrmap e16: compute-0.zllcia(active, since 2s), standbys: compute-1.xammti, compute-2.oynaeu
Jan 26 09:42:20 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:20 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:20 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:20 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 26 09:42:20 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:20 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 26 09:42:20 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Jan 26 09:42:20 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:20 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:20 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Jan 26 09:42:20 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:42:20 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 09:42:20 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:20 compute-1 sudo[83114]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph
Jan 26 09:42:20 compute-1 sudo[83114]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:20 compute-1 sudo[83114]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:20 compute-1 sudo[83139]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new
Jan 26 09:42:20 compute-1 sudo[83139]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:20 compute-1 sudo[83139]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:20 compute-1 sudo[83164]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:42:20 compute-1 sudo[83164]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:20 compute-1 sudo[83164]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:20 compute-1 sudo[83189]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new
Jan 26 09:42:20 compute-1 sudo[83189]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:20 compute-1 sudo[83189]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:20 compute-1 sudo[83237]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new
Jan 26 09:42:20 compute-1 sudo[83237]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:20 compute-1 sudo[83237]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:21 compute-1 sudo[83262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new
Jan 26 09:42:21 compute-1 sudo[83262]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:21 compute-1 sudo[83262]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:21 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.18 scrub starts
Jan 26 09:42:21 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.18 scrub ok
Jan 26 09:42:21 compute-1 sudo[83287]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Jan 26 09:42:21 compute-1 sudo[83287]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:21 compute-1 sudo[83287]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:21 compute-1 sudo[83312]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config
Jan 26 09:42:21 compute-1 sudo[83312]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:21 compute-1 sudo[83312]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:21 compute-1 sudo[83337]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config
Jan 26 09:42:21 compute-1 sudo[83337]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:21 compute-1 sudo[83337]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:21 compute-1 sudo[83362]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new
Jan 26 09:42:21 compute-1 sudo[83362]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:21 compute-1 sudo[83362]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:21 compute-1 sudo[83387]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:42:21 compute-1 sudo[83387]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:21 compute-1 sudo[83387]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:21 compute-1 sudo[83412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new
Jan 26 09:42:21 compute-1 sudo[83412]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:21 compute-1 sudo[83412]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:21 compute-1 sudo[83460]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new
Jan 26 09:42:21 compute-1 sudo[83460]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:21 compute-1 sudo[83460]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:21 compute-1 sudo[83485]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new
Jan 26 09:42:21 compute-1 sudo[83485]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:21 compute-1 sudo[83485]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:21 compute-1 sudo[83510]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 09:42:21 compute-1 sudo[83510]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:21 compute-1 sudo[83510]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:21 compute-1 sudo[83535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 26 09:42:21 compute-1 sudo[83535]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:21 compute-1 sudo[83535]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:21 compute-1 ceph-mon[80107]: 5.7 scrub starts
Jan 26 09:42:21 compute-1 ceph-mon[80107]: 5.7 scrub ok
Jan 26 09:42:21 compute-1 ceph-mon[80107]: 6.12 scrub starts
Jan 26 09:42:21 compute-1 ceph-mon[80107]: 6.12 scrub ok
Jan 26 09:42:21 compute-1 ceph-mon[80107]: Adjusting osd_memory_target on compute-0 to 127.9M
Jan 26 09:42:21 compute-1 ceph-mon[80107]: Unable to set osd_memory_target on compute-0 to 134209126: error parsing value: Value '134209126' is below minimum 939524096
Jan 26 09:42:21 compute-1 ceph-mon[80107]: Updating compute-0:/etc/ceph/ceph.conf
Jan 26 09:42:21 compute-1 ceph-mon[80107]: Updating compute-1:/etc/ceph/ceph.conf
Jan 26 09:42:21 compute-1 ceph-mon[80107]: Updating compute-2:/etc/ceph/ceph.conf
Jan 26 09:42:21 compute-1 ceph-mon[80107]: from='client.14409 -' entity='client.admin' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://192.168.122.100:9093", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 09:42:21 compute-1 ceph-mon[80107]: 6.4 scrub starts
Jan 26 09:42:21 compute-1 ceph-mon[80107]: 6.4 scrub ok
Jan 26 09:42:21 compute-1 ceph-mon[80107]: Updating compute-1:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 09:42:21 compute-1 ceph-mon[80107]: Updating compute-2:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 09:42:21 compute-1 ceph-mon[80107]: Updating compute-0:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 09:42:21 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:21 compute-1 sudo[83560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph
Jan 26 09:42:21 compute-1 sudo[83560]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:21 compute-1 sudo[83560]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:21 compute-1 sudo[83585]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.client.admin.keyring.new
Jan 26 09:42:21 compute-1 sudo[83585]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:21 compute-1 sudo[83585]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:21 compute-1 sudo[83610]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:42:21 compute-1 sudo[83610]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:21 compute-1 sudo[83610]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:21 compute-1 sudo[83635]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.client.admin.keyring.new
Jan 26 09:42:21 compute-1 sudo[83635]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:21 compute-1 sudo[83635]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:22 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.1a scrub starts
Jan 26 09:42:22 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.1a scrub ok
Jan 26 09:42:22 compute-1 sudo[83683]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.client.admin.keyring.new
Jan 26 09:42:22 compute-1 sudo[83683]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:22 compute-1 sudo[83683]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:22 compute-1 sudo[83708]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.client.admin.keyring.new
Jan 26 09:42:22 compute-1 sudo[83708]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:22 compute-1 sudo[83708]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:22 compute-1 sudo[83733]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Jan 26 09:42:22 compute-1 sudo[83733]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:22 compute-1 sudo[83733]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:22 compute-1 sudo[83758]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config
Jan 26 09:42:22 compute-1 sudo[83758]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:22 compute-1 sudo[83758]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:22 compute-1 sudo[83783]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config
Jan 26 09:42:22 compute-1 sudo[83783]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:22 compute-1 sudo[83783]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:22 compute-1 sudo[83808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring.new
Jan 26 09:42:22 compute-1 sudo[83808]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:22 compute-1 sudo[83808]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:22 compute-1 sudo[83833]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:42:22 compute-1 sudo[83833]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:22 compute-1 sudo[83833]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:22 compute-1 sudo[83858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring.new
Jan 26 09:42:22 compute-1 sudo[83858]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:22 compute-1 sudo[83858]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:22 compute-1 sudo[83906]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring.new
Jan 26 09:42:22 compute-1 sudo[83906]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:22 compute-1 sudo[83906]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:22 compute-1 sudo[83931]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring.new
Jan 26 09:42:22 compute-1 sudo[83931]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:22 compute-1 sudo[83931]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:22 compute-1 ceph-mon[80107]: 4.18 scrub starts
Jan 26 09:42:22 compute-1 ceph-mon[80107]: 4.18 scrub ok
Jan 26 09:42:22 compute-1 ceph-mon[80107]: from='client.14415 -' entity='client.admin' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://192.168.122.100:9092", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 09:42:22 compute-1 ceph-mon[80107]: 7.11 scrub starts
Jan 26 09:42:22 compute-1 ceph-mon[80107]: 7.11 scrub ok
Jan 26 09:42:22 compute-1 ceph-mon[80107]: pgmap v6: 197 pgs: 197 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:42:22 compute-1 ceph-mon[80107]: 6.9 scrub starts
Jan 26 09:42:22 compute-1 ceph-mon[80107]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 26 09:42:22 compute-1 ceph-mon[80107]: 6.9 scrub ok
Jan 26 09:42:22 compute-1 ceph-mon[80107]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 26 09:42:22 compute-1 ceph-mon[80107]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 26 09:42:22 compute-1 ceph-mon[80107]: Updating compute-0:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 09:42:22 compute-1 ceph-mon[80107]: Updating compute-2:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 09:42:22 compute-1 ceph-mon[80107]: Updating compute-1:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 09:42:22 compute-1 ceph-mon[80107]: from='client.24161 -' entity='client.admin' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "http://192.168.122.100:3100", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 09:42:22 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:22 compute-1 ceph-mon[80107]: mgrmap e17: compute-0.zllcia(active, since 4s), standbys: compute-1.xammti, compute-2.oynaeu
Jan 26 09:42:22 compute-1 sudo[83956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring.new /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 09:42:22 compute-1 sudo[83956]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:22 compute-1 sudo[83956]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:23 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.1c deep-scrub starts
Jan 26 09:42:23 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.1c deep-scrub ok
Jan 26 09:42:23 compute-1 ceph-mon[80107]: 6.1a scrub starts
Jan 26 09:42:23 compute-1 ceph-mon[80107]: 6.1a scrub ok
Jan 26 09:42:23 compute-1 ceph-mon[80107]: 2.15 scrub starts
Jan 26 09:42:23 compute-1 ceph-mon[80107]: 2.15 scrub ok
Jan 26 09:42:23 compute-1 ceph-mon[80107]: 6.1f deep-scrub starts
Jan 26 09:42:23 compute-1 ceph-mon[80107]: 6.1f deep-scrub ok
Jan 26 09:42:23 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:23 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:23 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:23 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:23 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:23 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:23 compute-1 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:23 compute-1 ceph-mon[80107]: Deploying daemon node-exporter.compute-0 on compute-0
Jan 26 09:42:23 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1657408057' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Jan 26 09:42:23 compute-1 ceph-mgr[80416]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 26 09:42:23 compute-1 ceph-mgr[80416]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 26 09:42:23 compute-1 ceph-mgr[80416]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 26 09:42:23 compute-1 ceph-mgr[80416]: mgr respawn  1: '-n'
Jan 26 09:42:23 compute-1 ceph-mgr[80416]: mgr respawn  2: 'mgr.compute-1.xammti'
Jan 26 09:42:23 compute-1 ceph-mgr[80416]: mgr respawn  3: '-f'
Jan 26 09:42:23 compute-1 ceph-mgr[80416]: mgr respawn  4: '--setuser'
Jan 26 09:42:23 compute-1 ceph-mgr[80416]: mgr respawn  5: 'ceph'
Jan 26 09:42:23 compute-1 ceph-mgr[80416]: mgr respawn  6: '--setgroup'
Jan 26 09:42:23 compute-1 ceph-mgr[80416]: mgr respawn  7: 'ceph'
Jan 26 09:42:23 compute-1 ceph-mgr[80416]: mgr respawn  8: '--default-log-to-file=false'
Jan 26 09:42:23 compute-1 ceph-mgr[80416]: mgr respawn  9: '--default-log-to-journald=true'
Jan 26 09:42:23 compute-1 ceph-mgr[80416]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 26 09:42:23 compute-1 ceph-mgr[80416]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 26 09:42:23 compute-1 ceph-mgr[80416]: mgr respawn  exe_path /proc/self/exe
Jan 26 09:42:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: ignoring --setuser ceph since I am not root
Jan 26 09:42:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: ignoring --setgroup ceph since I am not root
Jan 26 09:42:23 compute-1 sshd-session[82667]: Connection closed by 192.168.122.100 port 42160
Jan 26 09:42:23 compute-1 sshd-session[82664]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 26 09:42:23 compute-1 ceph-mgr[80416]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 26 09:42:23 compute-1 ceph-mgr[80416]: pidfile_write: ignore empty --pid-file
Jan 26 09:42:23 compute-1 systemd[1]: session-33.scope: Deactivated successfully.
Jan 26 09:42:23 compute-1 systemd[1]: session-33.scope: Consumed 4.544s CPU time.
Jan 26 09:42:23 compute-1 systemd-logind[783]: Session 33 logged out. Waiting for processes to exit.
Jan 26 09:42:23 compute-1 systemd-logind[783]: Removed session 33.
Jan 26 09:42:23 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'alerts'
Jan 26 09:42:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:24.086+0000 7f58e812f140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 09:42:24 compute-1 ceph-mgr[80416]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 09:42:24 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'balancer'
Jan 26 09:42:24 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.e scrub starts
Jan 26 09:42:24 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.e scrub ok
Jan 26 09:42:24 compute-1 ceph-mgr[80416]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 09:42:24 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'cephadm'
Jan 26 09:42:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:24.169+0000 7f58e812f140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 09:42:24 compute-1 ceph-mon[80107]: 5.1c deep-scrub starts
Jan 26 09:42:24 compute-1 ceph-mon[80107]: 5.1c deep-scrub ok
Jan 26 09:42:24 compute-1 ceph-mon[80107]: 7.1d deep-scrub starts
Jan 26 09:42:24 compute-1 ceph-mon[80107]: 7.1d deep-scrub ok
Jan 26 09:42:24 compute-1 ceph-mon[80107]: 4.1e scrub starts
Jan 26 09:42:24 compute-1 ceph-mon[80107]: 4.1e scrub ok
Jan 26 09:42:24 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1657408057' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Jan 26 09:42:24 compute-1 ceph-mon[80107]: mgrmap e18: compute-0.zllcia(active, since 6s), standbys: compute-1.xammti, compute-2.oynaeu
Jan 26 09:42:24 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/974467440' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Jan 26 09:42:24 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'crash'
Jan 26 09:42:25 compute-1 ceph-mgr[80416]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 09:42:25 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'dashboard'
Jan 26 09:42:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:25.039+0000 7f58e812f140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 09:42:25 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.e scrub starts
Jan 26 09:42:25 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.e scrub ok
Jan 26 09:42:25 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e54 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:42:25 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'devicehealth'
Jan 26 09:42:25 compute-1 ceph-mgr[80416]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 09:42:25 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'diskprediction_local'
Jan 26 09:42:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:25.703+0000 7f58e812f140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 09:42:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 26 09:42:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 26 09:42:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]:   from numpy import show_config as show_numpy_config
Jan 26 09:42:25 compute-1 ceph-mon[80107]: 4.e scrub starts
Jan 26 09:42:25 compute-1 ceph-mon[80107]: 4.e scrub ok
Jan 26 09:42:25 compute-1 ceph-mon[80107]: 4.14 scrub starts
Jan 26 09:42:25 compute-1 ceph-mon[80107]: 4.14 scrub ok
Jan 26 09:42:25 compute-1 ceph-mon[80107]: 6.1d scrub starts
Jan 26 09:42:25 compute-1 ceph-mon[80107]: 6.1d scrub ok
Jan 26 09:42:25 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/974467440' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Jan 26 09:42:25 compute-1 ceph-mon[80107]: mgrmap e19: compute-0.zllcia(active, since 7s), standbys: compute-1.xammti, compute-2.oynaeu
Jan 26 09:42:25 compute-1 ceph-mgr[80416]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 09:42:25 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'influx'
Jan 26 09:42:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:25.913+0000 7f58e812f140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 09:42:26 compute-1 ceph-mgr[80416]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 09:42:26 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'insights'
Jan 26 09:42:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:26.009+0000 7f58e812f140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 09:42:26 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.d deep-scrub starts
Jan 26 09:42:26 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.d deep-scrub ok
Jan 26 09:42:26 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'iostat'
Jan 26 09:42:26 compute-1 ceph-mgr[80416]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 09:42:26 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'k8sevents'
Jan 26 09:42:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:26.193+0000 7f58e812f140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 09:42:26 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'localpool'
Jan 26 09:42:26 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'mds_autoscaler'
Jan 26 09:42:26 compute-1 ceph-mon[80107]: 6.e scrub starts
Jan 26 09:42:26 compute-1 ceph-mon[80107]: 6.e scrub ok
Jan 26 09:42:26 compute-1 ceph-mon[80107]: 2.12 scrub starts
Jan 26 09:42:26 compute-1 ceph-mon[80107]: 2.12 scrub ok
Jan 26 09:42:26 compute-1 ceph-mon[80107]: 5.19 deep-scrub starts
Jan 26 09:42:26 compute-1 ceph-mon[80107]: 5.19 deep-scrub ok
Jan 26 09:42:26 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'mirroring'
Jan 26 09:42:27 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'nfs'
Jan 26 09:42:27 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.3 scrub starts
Jan 26 09:42:27 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.3 scrub ok
Jan 26 09:42:27 compute-1 ceph-mgr[80416]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 09:42:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:27.326+0000 7f58e812f140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 09:42:27 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'orchestrator'
Jan 26 09:42:27 compute-1 ceph-mgr[80416]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 09:42:27 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'osd_perf_query'
Jan 26 09:42:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:27.591+0000 7f58e812f140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 09:42:27 compute-1 ceph-mgr[80416]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 09:42:27 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'osd_support'
Jan 26 09:42:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:27.694+0000 7f58e812f140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 09:42:27 compute-1 ceph-mgr[80416]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 09:42:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:27.779+0000 7f58e812f140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 09:42:27 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'pg_autoscaler'
Jan 26 09:42:27 compute-1 ceph-mgr[80416]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 09:42:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:27.875+0000 7f58e812f140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 09:42:27 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'progress'
Jan 26 09:42:27 compute-1 ceph-mon[80107]: 6.d deep-scrub starts
Jan 26 09:42:27 compute-1 ceph-mon[80107]: 6.d deep-scrub ok
Jan 26 09:42:27 compute-1 ceph-mon[80107]: 6.17 scrub starts
Jan 26 09:42:27 compute-1 ceph-mon[80107]: 6.17 scrub ok
Jan 26 09:42:27 compute-1 ceph-mon[80107]: 5.17 scrub starts
Jan 26 09:42:27 compute-1 ceph-mon[80107]: 5.17 scrub ok
Jan 26 09:42:27 compute-1 ceph-mgr[80416]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 09:42:27 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'prometheus'
Jan 26 09:42:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:27.964+0000 7f58e812f140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 09:42:28 compute-1 sshd-session[84013]: Invalid user admin from 104.248.198.71 port 57118
Jan 26 09:42:28 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.7 scrub starts
Jan 26 09:42:28 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.7 scrub ok
Jan 26 09:42:28 compute-1 sshd-session[84013]: Connection closed by invalid user admin 104.248.198.71 port 57118 [preauth]
Jan 26 09:42:28 compute-1 ceph-mgr[80416]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 09:42:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:28.339+0000 7f58e812f140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 09:42:28 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'rbd_support'
Jan 26 09:42:28 compute-1 ceph-mgr[80416]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 09:42:28 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'restful'
Jan 26 09:42:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:28.442+0000 7f58e812f140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 09:42:28 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'rgw'
Jan 26 09:42:28 compute-1 ceph-mgr[80416]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 09:42:28 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'rook'
Jan 26 09:42:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:28.886+0000 7f58e812f140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 09:42:28 compute-1 ceph-mon[80107]: 6.3 scrub starts
Jan 26 09:42:28 compute-1 ceph-mon[80107]: 6.3 scrub ok
Jan 26 09:42:28 compute-1 ceph-mon[80107]: 3.12 scrub starts
Jan 26 09:42:28 compute-1 ceph-mon[80107]: 3.12 scrub ok
Jan 26 09:42:28 compute-1 ceph-mon[80107]: 7.16 scrub starts
Jan 26 09:42:28 compute-1 ceph-mon[80107]: 7.16 scrub ok
Jan 26 09:42:29 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.2 deep-scrub starts
Jan 26 09:42:29 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.2 deep-scrub ok
Jan 26 09:42:29 compute-1 ceph-mgr[80416]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 09:42:29 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'selftest'
Jan 26 09:42:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:29.495+0000 7f58e812f140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 09:42:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:29.577+0000 7f58e812f140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 09:42:29 compute-1 ceph-mgr[80416]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 09:42:29 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'snap_schedule'
Jan 26 09:42:29 compute-1 ceph-mgr[80416]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 09:42:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:29.672+0000 7f58e812f140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 09:42:29 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'stats'
Jan 26 09:42:29 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'status'
Jan 26 09:42:29 compute-1 ceph-mgr[80416]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 09:42:29 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'telegraf'
Jan 26 09:42:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:29.832+0000 7f58e812f140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 09:42:29 compute-1 ceph-mgr[80416]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 09:42:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:29.904+0000 7f58e812f140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 09:42:29 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'telemetry'
Jan 26 09:42:29 compute-1 ceph-mon[80107]: 6.7 scrub starts
Jan 26 09:42:29 compute-1 ceph-mon[80107]: 6.7 scrub ok
Jan 26 09:42:29 compute-1 ceph-mon[80107]: 2.10 scrub starts
Jan 26 09:42:29 compute-1 ceph-mon[80107]: 2.10 scrub ok
Jan 26 09:42:29 compute-1 ceph-mon[80107]: 7.1b scrub starts
Jan 26 09:42:29 compute-1 ceph-mon[80107]: 7.1b scrub ok
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 09:42:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:30.065+0000 7f58e812f140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'test_orchestrator'
Jan 26 09:42:30 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Jan 26 09:42:30 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 09:42:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:30.306+0000 7f58e812f140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'volumes'
Jan 26 09:42:30 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e55 e55: 3 total, 3 up, 3 in
Jan 26 09:42:30 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e55 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'zabbix'
Jan 26 09:42:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:30.608+0000 7f58e812f140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 09:42:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:30.678+0000 7f58e812f140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: ms_deliver_dispatch: unhandled message 0x559dd0457860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr respawn  1: '-n'
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr respawn  2: 'mgr.compute-1.xammti'
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr respawn  3: '-f'
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr respawn  4: '--setuser'
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr respawn  5: 'ceph'
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr respawn  6: '--setgroup'
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr respawn  7: 'ceph'
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr respawn  8: '--default-log-to-file=false'
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr respawn  9: '--default-log-to-journald=true'
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr respawn  exe_path /proc/self/exe
Jan 26 09:42:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: ignoring --setuser ceph since I am not root
Jan 26 09:42:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: ignoring --setgroup ceph since I am not root
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: pidfile_write: ignore empty --pid-file
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'alerts'
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 09:42:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:30.917+0000 7f6e378db140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 09:42:30 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'balancer'
Jan 26 09:42:30 compute-1 ceph-mon[80107]: 6.2 deep-scrub starts
Jan 26 09:42:30 compute-1 ceph-mon[80107]: 6.2 deep-scrub ok
Jan 26 09:42:30 compute-1 ceph-mon[80107]: 2.13 scrub starts
Jan 26 09:42:30 compute-1 ceph-mon[80107]: 2.13 scrub ok
Jan 26 09:42:30 compute-1 ceph-mon[80107]: 5.14 scrub starts
Jan 26 09:42:30 compute-1 ceph-mon[80107]: 5.14 scrub ok
Jan 26 09:42:30 compute-1 ceph-mon[80107]: Active manager daemon compute-0.zllcia restarted
Jan 26 09:42:30 compute-1 ceph-mon[80107]: Activating manager daemon compute-0.zllcia
Jan 26 09:42:30 compute-1 ceph-mon[80107]: osdmap e55: 3 total, 3 up, 3 in
Jan 26 09:42:30 compute-1 ceph-mon[80107]: mgrmap e20: compute-0.zllcia(active, starting, since 0.0272396s), standbys: compute-1.xammti, compute-2.oynaeu
Jan 26 09:42:30 compute-1 ceph-mon[80107]: Standby manager daemon compute-2.oynaeu restarted
Jan 26 09:42:30 compute-1 ceph-mon[80107]: Standby manager daemon compute-2.oynaeu started
Jan 26 09:42:30 compute-1 ceph-mon[80107]: Standby manager daemon compute-1.xammti restarted
Jan 26 09:42:30 compute-1 ceph-mon[80107]: Standby manager daemon compute-1.xammti started
Jan 26 09:42:31 compute-1 ceph-mgr[80416]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 09:42:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:31.023+0000 7f6e378db140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 09:42:31 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'cephadm'
Jan 26 09:42:31 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.d scrub starts
Jan 26 09:42:31 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.d scrub ok
Jan 26 09:42:31 compute-1 sshd-session[84015]: Connection closed by authenticating user root 178.62.249.31 port 56484 [preauth]
Jan 26 09:42:31 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'crash'
Jan 26 09:42:31 compute-1 ceph-mgr[80416]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 09:42:31 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'dashboard'
Jan 26 09:42:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:31.896+0000 7f6e378db140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 09:42:31 compute-1 ceph-mon[80107]: 6.8 scrub starts
Jan 26 09:42:31 compute-1 ceph-mon[80107]: 6.8 scrub ok
Jan 26 09:42:31 compute-1 ceph-mon[80107]: 6.1c scrub starts
Jan 26 09:42:31 compute-1 ceph-mon[80107]: 6.1c scrub ok
Jan 26 09:42:31 compute-1 ceph-mon[80107]: 7.10 scrub starts
Jan 26 09:42:31 compute-1 ceph-mon[80107]: 7.10 scrub ok
Jan 26 09:42:31 compute-1 ceph-mon[80107]: mgrmap e21: compute-0.zllcia(active, starting, since 1.05323s), standbys: compute-1.xammti, compute-2.oynaeu
Jan 26 09:42:32 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.c deep-scrub starts
Jan 26 09:42:32 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.c deep-scrub ok
Jan 26 09:42:32 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'devicehealth'
Jan 26 09:42:32 compute-1 ceph-mgr[80416]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 09:42:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:32.567+0000 7f6e378db140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 09:42:32 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'diskprediction_local'
Jan 26 09:42:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 26 09:42:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 26 09:42:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]:   from numpy import show_config as show_numpy_config
Jan 26 09:42:32 compute-1 ceph-mgr[80416]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 09:42:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:32.743+0000 7f6e378db140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 09:42:32 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'influx'
Jan 26 09:42:32 compute-1 ceph-mgr[80416]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 09:42:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:32.816+0000 7f6e378db140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 09:42:32 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'insights'
Jan 26 09:42:32 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'iostat'
Jan 26 09:42:32 compute-1 ceph-mgr[80416]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 09:42:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:32.962+0000 7f6e378db140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 09:42:32 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'k8sevents'
Jan 26 09:42:32 compute-1 ceph-mon[80107]: 3.d scrub starts
Jan 26 09:42:32 compute-1 ceph-mon[80107]: 3.d scrub ok
Jan 26 09:42:32 compute-1 ceph-mon[80107]: 4.8 scrub starts
Jan 26 09:42:32 compute-1 ceph-mon[80107]: 4.8 scrub ok
Jan 26 09:42:32 compute-1 ceph-mon[80107]: 3.17 deep-scrub starts
Jan 26 09:42:32 compute-1 ceph-mon[80107]: 3.17 deep-scrub ok
Jan 26 09:42:33 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.19 scrub starts
Jan 26 09:42:33 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.19 scrub ok
Jan 26 09:42:33 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'localpool'
Jan 26 09:42:33 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'mds_autoscaler'
Jan 26 09:42:33 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'mirroring'
Jan 26 09:42:33 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'nfs'
Jan 26 09:42:33 compute-1 ceph-mon[80107]: 4.c deep-scrub starts
Jan 26 09:42:33 compute-1 ceph-mon[80107]: 4.c deep-scrub ok
Jan 26 09:42:33 compute-1 ceph-mon[80107]: 7.14 deep-scrub starts
Jan 26 09:42:33 compute-1 ceph-mon[80107]: 7.14 deep-scrub ok
Jan 26 09:42:33 compute-1 ceph-mon[80107]: 7.13 scrub starts
Jan 26 09:42:33 compute-1 ceph-mon[80107]: 7.13 scrub ok
Jan 26 09:42:33 compute-1 ceph-mgr[80416]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 09:42:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:33.997+0000 7f6e378db140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 09:42:33 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'orchestrator'
Jan 26 09:42:34 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.f deep-scrub starts
Jan 26 09:42:34 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.f deep-scrub ok
Jan 26 09:42:34 compute-1 systemd[1]: Stopping User Manager for UID 42477...
Jan 26 09:42:34 compute-1 systemd[72713]: Activating special unit Exit the Session...
Jan 26 09:42:34 compute-1 systemd[72713]: Stopped target Main User Target.
Jan 26 09:42:34 compute-1 systemd[72713]: Stopped target Basic System.
Jan 26 09:42:34 compute-1 systemd[72713]: Stopped target Paths.
Jan 26 09:42:34 compute-1 systemd[72713]: Stopped target Sockets.
Jan 26 09:42:34 compute-1 systemd[72713]: Stopped target Timers.
Jan 26 09:42:34 compute-1 systemd[72713]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 26 09:42:34 compute-1 systemd[72713]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 26 09:42:34 compute-1 systemd[72713]: Closed D-Bus User Message Bus Socket.
Jan 26 09:42:34 compute-1 systemd[72713]: Stopped Create User's Volatile Files and Directories.
Jan 26 09:42:34 compute-1 systemd[72713]: Removed slice User Application Slice.
Jan 26 09:42:34 compute-1 systemd[72713]: Reached target Shutdown.
Jan 26 09:42:34 compute-1 systemd[72713]: Finished Exit the Session.
Jan 26 09:42:34 compute-1 systemd[72713]: Reached target Exit the Session.
Jan 26 09:42:34 compute-1 systemd[1]: user@42477.service: Deactivated successfully.
Jan 26 09:42:34 compute-1 systemd[1]: Stopped User Manager for UID 42477.
Jan 26 09:42:34 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Jan 26 09:42:34 compute-1 ceph-mgr[80416]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 09:42:34 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'osd_perf_query'
Jan 26 09:42:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:34.229+0000 7f6e378db140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 09:42:34 compute-1 systemd[1]: run-user-42477.mount: Deactivated successfully.
Jan 26 09:42:34 compute-1 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Jan 26 09:42:34 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Jan 26 09:42:34 compute-1 systemd[1]: Removed slice User Slice of UID 42477.
Jan 26 09:42:34 compute-1 systemd[1]: user-42477.slice: Consumed 1min 43.767s CPU time.
Jan 26 09:42:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:34.314+0000 7f6e378db140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 09:42:34 compute-1 ceph-mgr[80416]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 09:42:34 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'osd_support'
Jan 26 09:42:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:34.381+0000 7f6e378db140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 09:42:34 compute-1 ceph-mgr[80416]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 09:42:34 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'pg_autoscaler'
Jan 26 09:42:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:34.458+0000 7f6e378db140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 09:42:34 compute-1 ceph-mgr[80416]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 09:42:34 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'progress'
Jan 26 09:42:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:34.529+0000 7f6e378db140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 09:42:34 compute-1 ceph-mgr[80416]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 09:42:34 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'prometheus'
Jan 26 09:42:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:34.895+0000 7f6e378db140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 09:42:34 compute-1 ceph-mgr[80416]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 09:42:34 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'rbd_support'
Jan 26 09:42:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:35.003+0000 7f6e378db140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 09:42:35 compute-1 ceph-mgr[80416]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 09:42:35 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'restful'
Jan 26 09:42:35 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.d scrub starts
Jan 26 09:42:35 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.d scrub ok
Jan 26 09:42:35 compute-1 ceph-mon[80107]: 6.19 scrub starts
Jan 26 09:42:35 compute-1 ceph-mon[80107]: 6.19 scrub ok
Jan 26 09:42:35 compute-1 ceph-mon[80107]: 7.a scrub starts
Jan 26 09:42:35 compute-1 ceph-mon[80107]: 7.a scrub ok
Jan 26 09:42:35 compute-1 ceph-mon[80107]: 5.c deep-scrub starts
Jan 26 09:42:35 compute-1 ceph-mon[80107]: 5.c deep-scrub ok
Jan 26 09:42:35 compute-1 ceph-mon[80107]: 3.f deep-scrub starts
Jan 26 09:42:35 compute-1 ceph-mon[80107]: 3.f deep-scrub ok
Jan 26 09:42:35 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'rgw'
Jan 26 09:42:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:35.504+0000 7f6e378db140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 09:42:35 compute-1 ceph-mgr[80416]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 09:42:35 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'rook'
Jan 26 09:42:35 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e55 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:42:36 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.9 deep-scrub starts
Jan 26 09:42:36 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.9 deep-scrub ok
Jan 26 09:42:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:36.103+0000 7f6e378db140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 09:42:36 compute-1 ceph-mgr[80416]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 09:42:36 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'selftest'
Jan 26 09:42:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:36.175+0000 7f6e378db140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 09:42:36 compute-1 ceph-mgr[80416]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 09:42:36 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'snap_schedule'
Jan 26 09:42:36 compute-1 ceph-mon[80107]: 2.d scrub starts
Jan 26 09:42:36 compute-1 ceph-mon[80107]: 2.d scrub ok
Jan 26 09:42:36 compute-1 ceph-mon[80107]: 7.e deep-scrub starts
Jan 26 09:42:36 compute-1 ceph-mon[80107]: 7.e deep-scrub ok
Jan 26 09:42:36 compute-1 ceph-mon[80107]: 4.d scrub starts
Jan 26 09:42:36 compute-1 ceph-mon[80107]: 4.d scrub ok
Jan 26 09:42:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:36.262+0000 7f6e378db140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 09:42:36 compute-1 ceph-mgr[80416]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 09:42:36 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'stats'
Jan 26 09:42:36 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'status'
Jan 26 09:42:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:36.415+0000 7f6e378db140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 09:42:36 compute-1 ceph-mgr[80416]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 09:42:36 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'telegraf'
Jan 26 09:42:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:36.494+0000 7f6e378db140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 09:42:36 compute-1 ceph-mgr[80416]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 09:42:36 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'telemetry'
Jan 26 09:42:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:36.646+0000 7f6e378db140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 09:42:36 compute-1 ceph-mgr[80416]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 09:42:36 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'test_orchestrator'
Jan 26 09:42:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e56 e56: 3 total, 3 up, 3 in
Jan 26 09:42:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:36.877+0000 7f6e378db140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 09:42:36 compute-1 ceph-mgr[80416]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 09:42:36 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'volumes'
Jan 26 09:42:37 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.10 deep-scrub starts
Jan 26 09:42:37 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.10 deep-scrub ok
Jan 26 09:42:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:37.140+0000 7f6e378db140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 09:42:37 compute-1 ceph-mgr[80416]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 09:42:37 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'zabbix'
Jan 26 09:42:37 compute-1 ceph-mon[80107]: 4.15 scrub starts
Jan 26 09:42:37 compute-1 ceph-mon[80107]: 4.15 scrub ok
Jan 26 09:42:37 compute-1 ceph-mon[80107]: 7.f scrub starts
Jan 26 09:42:37 compute-1 ceph-mon[80107]: 7.f scrub ok
Jan 26 09:42:37 compute-1 ceph-mon[80107]: 5.9 deep-scrub starts
Jan 26 09:42:37 compute-1 ceph-mon[80107]: 5.9 deep-scrub ok
Jan 26 09:42:37 compute-1 ceph-mon[80107]: Active manager daemon compute-0.zllcia restarted
Jan 26 09:42:37 compute-1 ceph-mon[80107]: Activating manager daemon compute-0.zllcia
Jan 26 09:42:37 compute-1 ceph-mon[80107]: osdmap e56: 3 total, 3 up, 3 in
Jan 26 09:42:37 compute-1 ceph-mon[80107]: mgrmap e22: compute-0.zllcia(active, starting, since 0.0311037s), standbys: compute-1.xammti, compute-2.oynaeu
Jan 26 09:42:37 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Jan 26 09:42:37 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 26 09:42:37 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 26 09:42:37 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mgr metadata", "who": "compute-0.zllcia", "id": "compute-0.zllcia"}]: dispatch
Jan 26 09:42:37 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mgr metadata", "who": "compute-1.xammti", "id": "compute-1.xammti"}]: dispatch
Jan 26 09:42:37 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mgr metadata", "who": "compute-2.oynaeu", "id": "compute-2.oynaeu"}]: dispatch
Jan 26 09:42:37 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Jan 26 09:42:37 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Jan 26 09:42:37 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:42:37 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mds metadata"}]: dispatch
Jan 26 09:42:37 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 26 09:42:37 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata"}]: dispatch
Jan 26 09:42:37 compute-1 ceph-mon[80107]: Manager daemon compute-0.zllcia is now available
Jan 26 09:42:37 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.zllcia/mirror_snapshot_schedule"}]: dispatch
Jan 26 09:42:37 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.zllcia/trash_purge_schedule"}]: dispatch
Jan 26 09:42:37 compute-1 ceph-mon[80107]: Standby manager daemon compute-2.oynaeu restarted
Jan 26 09:42:37 compute-1 ceph-mon[80107]: Standby manager daemon compute-2.oynaeu started
Jan 26 09:42:37 compute-1 sshd-session[84050]: Accepted publickey for ceph-admin from 192.168.122.100 port 49156 ssh2: RSA SHA256:cGz1g5qmzBfeiAiDRElnaAonZh1cdMIZMAXyGkEzbws
Jan 26 09:42:37 compute-1 systemd-logind[783]: New session 34 of user ceph-admin.
Jan 26 09:42:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:37.206+0000 7f6e378db140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 09:42:37 compute-1 ceph-mgr[80416]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 09:42:37 compute-1 systemd[1]: Created slice User Slice of UID 42477.
Jan 26 09:42:37 compute-1 ceph-mgr[80416]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 26 09:42:37 compute-1 ceph-mgr[80416]: mgr load Constructed class from module: dashboard
Jan 26 09:42:37 compute-1 ceph-mgr[80416]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Jan 26 09:42:37 compute-1 ceph-mgr[80416]: ms_deliver_dispatch: unhandled message 0x557726af9860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Jan 26 09:42:37 compute-1 ceph-mgr[80416]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 26 09:42:37 compute-1 ceph-mgr[80416]: [dashboard INFO root] Starting engine...
Jan 26 09:42:37 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 26 09:42:37 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 26 09:42:37 compute-1 systemd[1]: Starting User Manager for UID 42477...
Jan 26 09:42:37 compute-1 systemd[84066]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 26 09:42:37 compute-1 ceph-mgr[80416]: [dashboard INFO root] Engine started...
Jan 26 09:42:37 compute-1 systemd[84066]: Queued start job for default target Main User Target.
Jan 26 09:42:37 compute-1 systemd[84066]: Created slice User Application Slice.
Jan 26 09:42:37 compute-1 systemd[84066]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 26 09:42:37 compute-1 systemd[84066]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 09:42:37 compute-1 systemd[84066]: Reached target Paths.
Jan 26 09:42:37 compute-1 systemd[84066]: Reached target Timers.
Jan 26 09:42:37 compute-1 systemd[84066]: Starting D-Bus User Message Bus Socket...
Jan 26 09:42:37 compute-1 systemd[84066]: Starting Create User's Volatile Files and Directories...
Jan 26 09:42:37 compute-1 systemd[84066]: Listening on D-Bus User Message Bus Socket.
Jan 26 09:42:37 compute-1 systemd[84066]: Reached target Sockets.
Jan 26 09:42:37 compute-1 systemd[84066]: Finished Create User's Volatile Files and Directories.
Jan 26 09:42:37 compute-1 systemd[84066]: Reached target Basic System.
Jan 26 09:42:37 compute-1 systemd[1]: Started User Manager for UID 42477.
Jan 26 09:42:37 compute-1 systemd[84066]: Reached target Main User Target.
Jan 26 09:42:37 compute-1 systemd[84066]: Startup finished in 125ms.
Jan 26 09:42:37 compute-1 systemd[1]: Started Session 34 of User ceph-admin.
Jan 26 09:42:37 compute-1 sshd-session[84050]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 26 09:42:37 compute-1 sudo[84083]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:42:37 compute-1 sudo[84083]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:37 compute-1 sudo[84083]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:37 compute-1 sudo[84108]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 26 09:42:37 compute-1 sudo[84108]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).mds e2 new map
Jan 26 09:42:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).mds e2 print_map
                                           e2
                                           btime 2026-01-26T09:42:37:723366+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-26T09:42:37.723319+0000
                                           modified        2026-01-26T09:42:37.723319+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                            
                                            
Jan 26 09:42:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e57 e57: 3 total, 3 up, 3 in
Jan 26 09:42:38 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.c scrub starts
Jan 26 09:42:38 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.c scrub ok
Jan 26 09:42:38 compute-1 podman[84204]: 2026-01-26 09:42:38.14791446 +0000 UTC m=+0.055282932 container exec 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:42:38 compute-1 ceph-mon[80107]: 2.f scrub starts
Jan 26 09:42:38 compute-1 ceph-mon[80107]: 2.f scrub ok
Jan 26 09:42:38 compute-1 ceph-mon[80107]: 3.7 deep-scrub starts
Jan 26 09:42:38 compute-1 ceph-mon[80107]: 3.7 deep-scrub ok
Jan 26 09:42:38 compute-1 ceph-mon[80107]: 3.10 deep-scrub starts
Jan 26 09:42:38 compute-1 ceph-mon[80107]: 3.10 deep-scrub ok
Jan 26 09:42:38 compute-1 ceph-mon[80107]: Standby manager daemon compute-1.xammti restarted
Jan 26 09:42:38 compute-1 ceph-mon[80107]: Standby manager daemon compute-1.xammti started
Jan 26 09:42:38 compute-1 ceph-mon[80107]: mgrmap e23: compute-0.zllcia(active, since 1.05181s), standbys: compute-1.xammti, compute-2.oynaeu
Jan 26 09:42:38 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Jan 26 09:42:38 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Jan 26 09:42:38 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Jan 26 09:42:38 compute-1 ceph-mon[80107]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Jan 26 09:42:38 compute-1 ceph-mon[80107]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Jan 26 09:42:38 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Jan 26 09:42:38 compute-1 ceph-mon[80107]: osdmap e57: 3 total, 3 up, 3 in
Jan 26 09:42:38 compute-1 ceph-mon[80107]: fsmap cephfs:0
Jan 26 09:42:38 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:38 compute-1 podman[84204]: 2026-01-26 09:42:38.238436257 +0000 UTC m=+0.145804749 container exec_died 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 09:42:38 compute-1 sudo[84108]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:38 compute-1 sudo[84309]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:42:38 compute-1 sudo[84309]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:38 compute-1 sudo[84309]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:38 compute-1 sudo[84334]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 09:42:38 compute-1 sudo[84334]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:39 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.15 scrub starts
Jan 26 09:42:39 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.15 scrub ok
Jan 26 09:42:39 compute-1 sudo[84334]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:39 compute-1 ceph-mon[80107]: 7.5 scrub starts
Jan 26 09:42:39 compute-1 ceph-mon[80107]: 7.5 scrub ok
Jan 26 09:42:39 compute-1 ceph-mon[80107]: 7.3 scrub starts
Jan 26 09:42:39 compute-1 ceph-mon[80107]: 7.3 scrub ok
Jan 26 09:42:39 compute-1 ceph-mon[80107]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 26 09:42:39 compute-1 ceph-mon[80107]: [26/Jan/2026:09:42:37] ENGINE Bus STARTING
Jan 26 09:42:39 compute-1 ceph-mon[80107]: [26/Jan/2026:09:42:38] ENGINE Serving on https://192.168.122.100:7150
Jan 26 09:42:39 compute-1 ceph-mon[80107]: [26/Jan/2026:09:42:38] ENGINE Client ('192.168.122.100', 37614) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 26 09:42:39 compute-1 ceph-mon[80107]: 3.c scrub starts
Jan 26 09:42:39 compute-1 ceph-mon[80107]: 3.c scrub ok
Jan 26 09:42:39 compute-1 ceph-mon[80107]: [26/Jan/2026:09:42:38] ENGINE Serving on http://192.168.122.100:8765
Jan 26 09:42:39 compute-1 ceph-mon[80107]: [26/Jan/2026:09:42:38] ENGINE Bus STARTED
Jan 26 09:42:39 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:39 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:39 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:39 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:39 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:39 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:39 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:39 compute-1 sudo[84389]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:42:39 compute-1 sudo[84389]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:39 compute-1 sudo[84389]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:39 compute-1 sudo[84414]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Jan 26 09:42:39 compute-1 sudo[84414]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:39 compute-1 sudo[84414]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:40 compute-1 sudo[84457]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 26 09:42:40 compute-1 sudo[84457]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:40 compute-1 sudo[84457]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:40 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Jan 26 09:42:40 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Jan 26 09:42:40 compute-1 sudo[84482]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph
Jan 26 09:42:40 compute-1 sudo[84482]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:40 compute-1 sudo[84482]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:40 compute-1 sudo[84507]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new
Jan 26 09:42:40 compute-1 sudo[84507]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:40 compute-1 sudo[84507]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:40 compute-1 sudo[84532]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:42:40 compute-1 sudo[84532]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:40 compute-1 sudo[84532]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:40 compute-1 sudo[84557]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new
Jan 26 09:42:40 compute-1 sudo[84557]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:40 compute-1 sudo[84557]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:40 compute-1 sudo[84605]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new
Jan 26 09:42:40 compute-1 sudo[84605]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:40 compute-1 ceph-mon[80107]: 4.1 scrub starts
Jan 26 09:42:40 compute-1 ceph-mon[80107]: 4.1 scrub ok
Jan 26 09:42:40 compute-1 ceph-mon[80107]: 5.3 scrub starts
Jan 26 09:42:40 compute-1 ceph-mon[80107]: 5.3 scrub ok
Jan 26 09:42:40 compute-1 ceph-mon[80107]: pgmap v5: 197 pgs: 197 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:42:40 compute-1 ceph-mon[80107]: from='client.14478 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 09:42:40 compute-1 ceph-mon[80107]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 26 09:42:40 compute-1 ceph-mon[80107]: 5.15 scrub starts
Jan 26 09:42:40 compute-1 ceph-mon[80107]: 5.15 scrub ok
Jan 26 09:42:40 compute-1 ceph-mon[80107]: mgrmap e24: compute-0.zllcia(active, since 2s), standbys: compute-1.xammti, compute-2.oynaeu
Jan 26 09:42:40 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:40 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:40 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 26 09:42:40 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]: dispatch
Jan 26 09:42:40 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:40 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:40 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 26 09:42:40 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:40 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:40 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 26 09:42:40 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:42:40 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 09:42:40 compute-1 sudo[84605]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:40 compute-1 sudo[84630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new
Jan 26 09:42:40 compute-1 sudo[84630]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:40 compute-1 sudo[84630]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:40 compute-1 sudo[84655]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Jan 26 09:42:40 compute-1 sudo[84655]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:40 compute-1 sudo[84655]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:40 compute-1 sudo[84680]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config
Jan 26 09:42:40 compute-1 sudo[84680]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:40 compute-1 sudo[84680]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:40 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:42:40 compute-1 sudo[84705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config
Jan 26 09:42:40 compute-1 sudo[84705]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:40 compute-1 sudo[84705]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:40 compute-1 sudo[84730]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new
Jan 26 09:42:40 compute-1 sudo[84730]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:40 compute-1 sudo[84730]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:40 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e58 e58: 3 total, 3 up, 3 in
Jan 26 09:42:40 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 58 pg[12.0( empty local-lis/les=0/0 n=0 ec=58/58 lis/c=0/0 les/c/f=0/0/0 sis=58) [1] r=0 lpr=58 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:42:40 compute-1 sudo[84755]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:42:40 compute-1 sudo[84755]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:40 compute-1 sudo[84755]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:40 compute-1 sudo[84780]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new
Jan 26 09:42:40 compute-1 sudo[84780]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:40 compute-1 sudo[84780]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:40 compute-1 sudo[84828]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new
Jan 26 09:42:40 compute-1 sudo[84828]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:40 compute-1 sudo[84828]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:40 compute-1 sudo[84853]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new
Jan 26 09:42:40 compute-1 sudo[84853]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:40 compute-1 sudo[84853]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:41 compute-1 sudo[84878]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 09:42:41 compute-1 sudo[84878]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:41 compute-1 sudo[84878]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:41 compute-1 sudo[84903]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 26 09:42:41 compute-1 sudo[84903]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:41 compute-1 sudo[84903]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:41 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Jan 26 09:42:41 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Jan 26 09:42:41 compute-1 sudo[84928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph
Jan 26 09:42:41 compute-1 sudo[84928]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:41 compute-1 sudo[84928]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:41 compute-1 sudo[84953]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.client.admin.keyring.new
Jan 26 09:42:41 compute-1 sudo[84953]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:41 compute-1 sudo[84953]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:41 compute-1 sudo[84978]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:42:41 compute-1 sudo[84978]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:41 compute-1 sudo[84978]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:41 compute-1 sudo[85003]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.client.admin.keyring.new
Jan 26 09:42:41 compute-1 sudo[85003]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:41 compute-1 sudo[85003]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:41 compute-1 sudo[85051]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.client.admin.keyring.new
Jan 26 09:42:41 compute-1 sudo[85051]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:41 compute-1 sudo[85051]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:41 compute-1 ceph-mon[80107]: 2.c scrub starts
Jan 26 09:42:41 compute-1 ceph-mon[80107]: 2.c scrub ok
Jan 26 09:42:41 compute-1 ceph-mon[80107]: 2.6 scrub starts
Jan 26 09:42:41 compute-1 ceph-mon[80107]: 2.6 scrub ok
Jan 26 09:42:41 compute-1 ceph-mon[80107]: from='client.14490 -' entity='client.admin' cmd=[{"prefix": "nfs cluster create", "cluster_id": "cephfs", "ingress": true, "virtual_ip": "192.168.122.2/24", "ingress_mode": "haproxy-protocol", "placement": "compute-0 compute-1 compute-2 ", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 09:42:41 compute-1 ceph-mon[80107]: Updating compute-0:/etc/ceph/ceph.conf
Jan 26 09:42:41 compute-1 ceph-mon[80107]: Updating compute-1:/etc/ceph/ceph.conf
Jan 26 09:42:41 compute-1 ceph-mon[80107]: Updating compute-2:/etc/ceph/ceph.conf
Jan 26 09:42:41 compute-1 ceph-mon[80107]: 3.14 scrub starts
Jan 26 09:42:41 compute-1 ceph-mon[80107]: 3.14 scrub ok
Jan 26 09:42:41 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]': finished
Jan 26 09:42:41 compute-1 ceph-mon[80107]: osdmap e58: 3 total, 3 up, 3 in
Jan 26 09:42:41 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]: dispatch
Jan 26 09:42:41 compute-1 sudo[85076]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.client.admin.keyring.new
Jan 26 09:42:41 compute-1 sudo[85076]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:41 compute-1 sudo[85076]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:41 compute-1 sudo[85101]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Jan 26 09:42:41 compute-1 sudo[85101]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:41 compute-1 sudo[85101]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:41 compute-1 sudo[85126]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config
Jan 26 09:42:41 compute-1 sudo[85126]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:41 compute-1 sudo[85126]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:41 compute-1 sudo[85151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config
Jan 26 09:42:41 compute-1 sudo[85151]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:41 compute-1 sudo[85151]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:41 compute-1 sudo[85176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring.new
Jan 26 09:42:41 compute-1 sudo[85176]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:41 compute-1 sudo[85176]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:41 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e59 e59: 3 total, 3 up, 3 in
Jan 26 09:42:41 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 59 pg[12.0( empty local-lis/les=58/59 n=0 ec=58/58 lis/c=0/0 les/c/f=0/0/0 sis=58) [1] r=0 lpr=58 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:42:41 compute-1 sudo[85201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:42:41 compute-1 sudo[85201]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:41 compute-1 sudo[85201]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:41 compute-1 sudo[85226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring.new
Jan 26 09:42:41 compute-1 sudo[85226]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:41 compute-1 sudo[85226]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:41 compute-1 sudo[85274]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring.new
Jan 26 09:42:41 compute-1 sudo[85274]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:41 compute-1 sudo[85274]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:42 compute-1 sudo[85299]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring.new
Jan 26 09:42:42 compute-1 sudo[85299]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:42 compute-1 sudo[85299]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:42 compute-1 sudo[85324]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring.new /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 09:42:42 compute-1 sudo[85324]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:42 compute-1 sudo[85324]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:42 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.16 deep-scrub starts
Jan 26 09:42:42 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.16 deep-scrub ok
Jan 26 09:42:42 compute-1 ceph-mon[80107]: 4.3 scrub starts
Jan 26 09:42:42 compute-1 ceph-mon[80107]: 4.3 scrub ok
Jan 26 09:42:42 compute-1 ceph-mon[80107]: Updating compute-1:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 09:42:42 compute-1 ceph-mon[80107]: Updating compute-2:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 09:42:42 compute-1 ceph-mon[80107]: 3.6 deep-scrub starts
Jan 26 09:42:42 compute-1 ceph-mon[80107]: 3.6 deep-scrub ok
Jan 26 09:42:42 compute-1 ceph-mon[80107]: pgmap v6: 197 pgs: 197 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:42:42 compute-1 ceph-mon[80107]: Updating compute-0:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 09:42:42 compute-1 ceph-mon[80107]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 26 09:42:42 compute-1 ceph-mon[80107]: 4.13 scrub starts
Jan 26 09:42:42 compute-1 ceph-mon[80107]: 4.13 scrub ok
Jan 26 09:42:42 compute-1 ceph-mon[80107]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 26 09:42:42 compute-1 ceph-mon[80107]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 26 09:42:42 compute-1 ceph-mon[80107]: mgrmap e25: compute-0.zllcia(active, since 4s), standbys: compute-1.xammti, compute-2.oynaeu
Jan 26 09:42:42 compute-1 ceph-mon[80107]: 4.2 scrub starts
Jan 26 09:42:42 compute-1 ceph-mon[80107]: 4.2 scrub ok
Jan 26 09:42:42 compute-1 ceph-mon[80107]: 5.5 scrub starts
Jan 26 09:42:42 compute-1 ceph-mon[80107]: 5.5 scrub ok
Jan 26 09:42:42 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]': finished
Jan 26 09:42:42 compute-1 ceph-mon[80107]: osdmap e59: 3 total, 3 up, 3 in
Jan 26 09:42:42 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:42 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:42 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:42 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:42 compute-1 sudo[85349]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:42:42 compute-1 sudo[85349]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:42 compute-1 sudo[85349]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:42 compute-1 sudo[85374]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/prometheus/node-exporter:v1.7.0 --timeout 895 _orch deploy --fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:42:42 compute-1 sudo[85374]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 e60: 3 total, 3 up, 3 in
Jan 26 09:42:43 compute-1 systemd[1]: Reloading.
Jan 26 09:42:43 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.13 deep-scrub starts
Jan 26 09:42:43 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.13 deep-scrub ok
Jan 26 09:42:43 compute-1 systemd-rc-local-generator[85470]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:42:43 compute-1 systemd-sysv-generator[85474]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:42:43 compute-1 systemd[1]: Reloading.
Jan 26 09:42:43 compute-1 ceph-mon[80107]: Updating compute-1:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 09:42:43 compute-1 ceph-mon[80107]: Saving service nfs.cephfs spec with placement compute-0;compute-1;compute-2
Jan 26 09:42:43 compute-1 ceph-mon[80107]: Updating compute-0:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 09:42:43 compute-1 ceph-mon[80107]: Saving service ingress.nfs.cephfs spec with placement compute-0;compute-1;compute-2
Jan 26 09:42:43 compute-1 ceph-mon[80107]: Updating compute-2:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 09:42:43 compute-1 ceph-mon[80107]: 5.16 deep-scrub starts
Jan 26 09:42:43 compute-1 ceph-mon[80107]: 5.16 deep-scrub ok
Jan 26 09:42:43 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:43 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:43 compute-1 ceph-mon[80107]: 2.5 scrub starts
Jan 26 09:42:43 compute-1 ceph-mon[80107]: 2.5 scrub ok
Jan 26 09:42:43 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:43 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:43 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:43 compute-1 ceph-mon[80107]: 3.2 scrub starts
Jan 26 09:42:43 compute-1 ceph-mon[80107]: 3.2 scrub ok
Jan 26 09:42:43 compute-1 ceph-mon[80107]: osdmap e60: 3 total, 3 up, 3 in
Jan 26 09:42:43 compute-1 systemd-sysv-generator[85511]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:42:43 compute-1 systemd-rc-local-generator[85506]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:42:43 compute-1 systemd[1]: Starting Ceph node-exporter.compute-1 for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 09:42:43 compute-1 bash[85564]: Trying to pull quay.io/prometheus/node-exporter:v1.7.0...
Jan 26 09:42:44 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.11 scrub starts
Jan 26 09:42:44 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.11 scrub ok
Jan 26 09:42:44 compute-1 ceph-mon[80107]: Deploying daemon node-exporter.compute-1 on compute-1
Jan 26 09:42:44 compute-1 ceph-mon[80107]: pgmap v9: 198 pgs: 1 unknown, 197 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:42:44 compute-1 ceph-mon[80107]: 3.13 deep-scrub starts
Jan 26 09:42:44 compute-1 ceph-mon[80107]: 3.13 deep-scrub ok
Jan 26 09:42:44 compute-1 ceph-mon[80107]: 2.b scrub starts
Jan 26 09:42:44 compute-1 ceph-mon[80107]: 2.b scrub ok
Jan 26 09:42:44 compute-1 ceph-mon[80107]: 5.a deep-scrub starts
Jan 26 09:42:44 compute-1 ceph-mon[80107]: 5.a deep-scrub ok
Jan 26 09:42:44 compute-1 ceph-mon[80107]: mgrmap e26: compute-0.zllcia(active, since 7s), standbys: compute-1.xammti, compute-2.oynaeu
Jan 26 09:42:44 compute-1 bash[85564]: Getting image source signatures
Jan 26 09:42:44 compute-1 bash[85564]: Copying blob sha256:324153f2810a9927fcce320af9e4e291e0b6e805cbdd1f338386c756b9defa24
Jan 26 09:42:44 compute-1 bash[85564]: Copying blob sha256:2abcce694348cd2c949c0e98a7400ebdfd8341021bcf6b541bc72033ce982510
Jan 26 09:42:44 compute-1 bash[85564]: Copying blob sha256:455fd88e5221bc1e278ef2d059cd70e4df99a24e5af050ede621534276f6cf9a
Jan 26 09:42:45 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.10 scrub starts
Jan 26 09:42:45 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.10 scrub ok
Jan 26 09:42:45 compute-1 bash[85564]: Copying config sha256:72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e
Jan 26 09:42:45 compute-1 bash[85564]: Writing manifest to image destination
Jan 26 09:42:45 compute-1 podman[85564]: 2026-01-26 09:42:45.395366304 +0000 UTC m=+1.446981141 container create 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 09:42:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eba8c33c607fbcb65ee9cea632dfe2337902fa89cd4980b43909b2c88fc1600c/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Jan 26 09:42:45 compute-1 podman[85564]: 2026-01-26 09:42:45.448808443 +0000 UTC m=+1.500423330 container init 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 09:42:45 compute-1 podman[85564]: 2026-01-26 09:42:45.453902505 +0000 UTC m=+1.505517342 container start 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 09:42:45 compute-1 bash[85564]: 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a
Jan 26 09:42:45 compute-1 podman[85564]: 2026-01-26 09:42:45.381200911 +0000 UTC m=+1.432815768 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.460Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.460Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.461Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.461Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.461Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.461Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=arp
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=bcache
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=bonding
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=cpu
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=dmi
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=edac
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=entropy
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=filefd
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=hwmon
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=netclass
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=netdev
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=netstat
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=nfs
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=nvme
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=os
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=pressure
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=rapl
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=selinux
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=softnet
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=stat
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=textfile
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=thermal_zone
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=time
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=uname
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=xfs
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=zfs
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.464Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Jan 26 09:42:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.464Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Jan 26 09:42:45 compute-1 systemd[1]: Started Ceph node-exporter.compute-1 for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:42:45 compute-1 sudo[85374]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:45 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:42:45 compute-1 ceph-mon[80107]: 5.11 scrub starts
Jan 26 09:42:45 compute-1 ceph-mon[80107]: 5.11 scrub ok
Jan 26 09:42:45 compute-1 ceph-mon[80107]: 2.a scrub starts
Jan 26 09:42:45 compute-1 ceph-mon[80107]: 2.a scrub ok
Jan 26 09:42:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/234657791' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Jan 26 09:42:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/234657791' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Jan 26 09:42:45 compute-1 ceph-mon[80107]: 2.e deep-scrub starts
Jan 26 09:42:45 compute-1 ceph-mon[80107]: 2.e deep-scrub ok
Jan 26 09:42:46 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.1f scrub starts
Jan 26 09:42:46 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.1f scrub ok
Jan 26 09:42:46 compute-1 ceph-mon[80107]: pgmap v11: 198 pgs: 198 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 0 B/s wr, 12 op/s
Jan 26 09:42:46 compute-1 ceph-mon[80107]: 5.10 scrub starts
Jan 26 09:42:46 compute-1 ceph-mon[80107]: 5.10 scrub ok
Jan 26 09:42:46 compute-1 ceph-mon[80107]: 4.19 deep-scrub starts
Jan 26 09:42:46 compute-1 ceph-mon[80107]: 4.19 deep-scrub ok
Jan 26 09:42:46 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:46 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:46 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:46 compute-1 ceph-mon[80107]: Deploying daemon node-exporter.compute-2 on compute-2
Jan 26 09:42:46 compute-1 ceph-mon[80107]: 5.1d scrub starts
Jan 26 09:42:46 compute-1 ceph-mon[80107]: 5.1d scrub ok
Jan 26 09:42:46 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2890019561' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Jan 26 09:42:47 compute-1 sshd-session[85649]: Connection closed by authenticating user root 152.42.136.110 port 59028 [preauth]
Jan 26 09:42:47 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.16 deep-scrub starts
Jan 26 09:42:47 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.16 deep-scrub ok
Jan 26 09:42:47 compute-1 ceph-mon[80107]: 5.1f scrub starts
Jan 26 09:42:47 compute-1 ceph-mon[80107]: 5.1f scrub ok
Jan 26 09:42:47 compute-1 ceph-mon[80107]: 4.9 scrub starts
Jan 26 09:42:47 compute-1 ceph-mon[80107]: 4.9 scrub ok
Jan 26 09:42:47 compute-1 ceph-mon[80107]: pgmap v12: 198 pgs: 198 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 0 B/s wr, 12 op/s
Jan 26 09:42:47 compute-1 ceph-mon[80107]: 5.6 scrub starts
Jan 26 09:42:47 compute-1 ceph-mon[80107]: 5.6 scrub ok
Jan 26 09:42:47 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/4072035321' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 09:42:48 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.2 scrub starts
Jan 26 09:42:48 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.2 scrub ok
Jan 26 09:42:48 compute-1 ceph-mon[80107]: 3.16 deep-scrub starts
Jan 26 09:42:48 compute-1 ceph-mon[80107]: 3.16 deep-scrub ok
Jan 26 09:42:48 compute-1 ceph-mon[80107]: 6.1 scrub starts
Jan 26 09:42:48 compute-1 ceph-mon[80107]: 6.1 scrub ok
Jan 26 09:42:48 compute-1 ceph-mon[80107]: 3.19 scrub starts
Jan 26 09:42:48 compute-1 ceph-mon[80107]: 3.19 scrub ok
Jan 26 09:42:48 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:48 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:48 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:48 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:48 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 09:42:48 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 09:42:48 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:42:49 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.1 scrub starts
Jan 26 09:42:49 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.1 scrub ok
Jan 26 09:42:49 compute-1 ceph-mon[80107]: 5.2 scrub starts
Jan 26 09:42:49 compute-1 ceph-mon[80107]: 5.2 scrub ok
Jan 26 09:42:49 compute-1 ceph-mon[80107]: 6.1b scrub starts
Jan 26 09:42:49 compute-1 ceph-mon[80107]: 6.1b scrub ok
Jan 26 09:42:49 compute-1 ceph-mon[80107]: 3.18 scrub starts
Jan 26 09:42:49 compute-1 ceph-mon[80107]: 3.18 scrub ok
Jan 26 09:42:49 compute-1 ceph-mon[80107]: pgmap v13: 198 pgs: 198 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 0 B/s wr, 9 op/s
Jan 26 09:42:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/113232953' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Jan 26 09:42:50 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.3 scrub starts
Jan 26 09:42:50 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.3 scrub ok
Jan 26 09:42:50 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:42:50 compute-1 ceph-mon[80107]: 5.1 scrub starts
Jan 26 09:42:50 compute-1 ceph-mon[80107]: 5.1 scrub ok
Jan 26 09:42:51 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.f scrub starts
Jan 26 09:42:51 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.f scrub ok
Jan 26 09:42:51 compute-1 ceph-mon[80107]: 2.19 scrub starts
Jan 26 09:42:51 compute-1 ceph-mon[80107]: 2.19 scrub ok
Jan 26 09:42:51 compute-1 ceph-mon[80107]: 3.3 scrub starts
Jan 26 09:42:51 compute-1 ceph-mon[80107]: 3.3 scrub ok
Jan 26 09:42:51 compute-1 ceph-mon[80107]: pgmap v14: 198 pgs: 198 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 0 B/s wr, 8 op/s
Jan 26 09:42:51 compute-1 ceph-mon[80107]: 5.1e scrub starts
Jan 26 09:42:51 compute-1 ceph-mon[80107]: 5.1e scrub ok
Jan 26 09:42:51 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:51 compute-1 ceph-mon[80107]: 3.1 scrub starts
Jan 26 09:42:51 compute-1 ceph-mon[80107]: 3.1 scrub ok
Jan 26 09:42:52 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.18 scrub starts
Jan 26 09:42:52 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.18 scrub ok
Jan 26 09:42:52 compute-1 ceph-mon[80107]: 5.f scrub starts
Jan 26 09:42:52 compute-1 ceph-mon[80107]: 5.f scrub ok
Jan 26 09:42:52 compute-1 ceph-mon[80107]: from='client.14526 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Jan 26 09:42:52 compute-1 ceph-mon[80107]: 7.2 scrub starts
Jan 26 09:42:52 compute-1 ceph-mon[80107]: 7.2 scrub ok
Jan 26 09:42:53 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.1b deep-scrub starts
Jan 26 09:42:53 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.1b deep-scrub ok
Jan 26 09:42:54 compute-1 ceph-mon[80107]: 5.18 scrub starts
Jan 26 09:42:54 compute-1 ceph-mon[80107]: 5.18 scrub ok
Jan 26 09:42:54 compute-1 ceph-mon[80107]: pgmap v15: 198 pgs: 198 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 0 B/s wr, 7 op/s
Jan 26 09:42:54 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:54 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:54 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.qkzyup", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 26 09:42:54 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.qkzyup", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 26 09:42:54 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:54 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:42:54 compute-1 ceph-mon[80107]: Deploying daemon rgw.rgw.compute-0.qkzyup on compute-0
Jan 26 09:42:54 compute-1 ceph-mon[80107]: 7.6 scrub starts
Jan 26 09:42:54 compute-1 ceph-mon[80107]: 7.6 scrub ok
Jan 26 09:42:54 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.15 scrub starts
Jan 26 09:42:54 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.15 scrub ok
Jan 26 09:42:55 compute-1 ceph-mon[80107]: 5.1b deep-scrub starts
Jan 26 09:42:55 compute-1 ceph-mon[80107]: 5.1b deep-scrub ok
Jan 26 09:42:55 compute-1 ceph-mon[80107]: 6.15 scrub starts
Jan 26 09:42:55 compute-1 ceph-mon[80107]: 6.15 scrub ok
Jan 26 09:42:55 compute-1 ceph-mon[80107]: from='client.14532 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Jan 26 09:42:55 compute-1 ceph-mon[80107]: 7.18 scrub starts
Jan 26 09:42:55 compute-1 ceph-mon[80107]: 7.18 scrub ok
Jan 26 09:42:55 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:55 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:55 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:55 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:55 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.a scrub starts
Jan 26 09:42:55 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.a scrub ok
Jan 26 09:42:55 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:42:56 compute-1 ceph-mon[80107]: pgmap v16: 198 pgs: 198 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:42:56 compute-1 ceph-mon[80107]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 26 09:42:56 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:56 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.zprrum", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 26 09:42:56 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.zprrum", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 26 09:42:56 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:42:56 compute-1 ceph-mon[80107]: Deploying daemon mds.cephfs.compute-2.zprrum on compute-2
Jan 26 09:42:56 compute-1 ceph-mon[80107]: 6.a scrub starts
Jan 26 09:42:56 compute-1 ceph-mon[80107]: 6.a scrub ok
Jan 26 09:42:56 compute-1 ceph-mon[80107]: 7.8 deep-scrub starts
Jan 26 09:42:56 compute-1 ceph-mon[80107]: 7.8 deep-scrub ok
Jan 26 09:42:56 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.5 scrub starts
Jan 26 09:42:56 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.5 scrub ok
Jan 26 09:42:57 compute-1 ceph-mon[80107]: from='client.14556 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Jan 26 09:42:57 compute-1 ceph-mon[80107]: 6.5 scrub starts
Jan 26 09:42:57 compute-1 ceph-mon[80107]: 6.5 scrub ok
Jan 26 09:42:57 compute-1 ceph-mon[80107]: 7.4 deep-scrub starts
Jan 26 09:42:57 compute-1 ceph-mon[80107]: 7.4 deep-scrub ok
Jan 26 09:42:57 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:57 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:57 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:57 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:57 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.zhqpiu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 26 09:42:57 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.zhqpiu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 26 09:42:57 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:42:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).mds e3 new map
Jan 26 09:42:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).mds e3 print_map
                                           e3
                                           btime 2026-01-26T09:42:57:034497+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-26T09:42:37.723319+0000
                                           modified        2026-01-26T09:42:37.723319+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-2.zprrum{-1:24220} state up:standby seq 1 addr [v2:192.168.122.102:6804/1987962990,v1:192.168.122.102:6805/1987962990] compat {c=[1],r=[1],i=[1fff]}]
Jan 26 09:42:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).mds e4 new map
Jan 26 09:42:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).mds e4 print_map
                                           e4
                                           btime 2026-01-26T09:42:57:061062+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        4
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-26T09:42:37.723319+0000
                                           modified        2026-01-26T09:42:57.061055+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24220}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                           [mds.cephfs.compute-2.zprrum{0:24220} state up:creating seq 1 addr [v2:192.168.122.102:6804/1987962990,v1:192.168.122.102:6805/1987962990] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
Jan 26 09:42:58 compute-1 ceph-mon[80107]: pgmap v17: 198 pgs: 198 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:42:58 compute-1 ceph-mon[80107]: Deploying daemon mds.cephfs.compute-0.zhqpiu on compute-0
Jan 26 09:42:58 compute-1 ceph-mon[80107]: mds.? [v2:192.168.122.102:6804/1987962990,v1:192.168.122.102:6805/1987962990] up:boot
Jan 26 09:42:58 compute-1 ceph-mon[80107]: daemon mds.cephfs.compute-2.zprrum assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Jan 26 09:42:58 compute-1 ceph-mon[80107]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Jan 26 09:42:58 compute-1 ceph-mon[80107]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Jan 26 09:42:58 compute-1 ceph-mon[80107]: Cluster is now healthy
Jan 26 09:42:58 compute-1 ceph-mon[80107]: fsmap cephfs:0 1 up:standby
Jan 26 09:42:58 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-2.zprrum"}]: dispatch
Jan 26 09:42:58 compute-1 ceph-mon[80107]: fsmap cephfs:1 {0=cephfs.compute-2.zprrum=up:creating}
Jan 26 09:42:58 compute-1 ceph-mon[80107]: daemon mds.cephfs.compute-2.zprrum is now active in filesystem cephfs as rank 0
Jan 26 09:42:58 compute-1 ceph-mon[80107]: from='client.14562 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Jan 26 09:42:58 compute-1 ceph-mon[80107]: 7.9 scrub starts
Jan 26 09:42:58 compute-1 ceph-mon[80107]: 7.9 scrub ok
Jan 26 09:42:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).mds e5 new map
Jan 26 09:42:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).mds e5 print_map
                                           e5
                                           btime 2026-01-26T09:42:58:074031+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-26T09:42:37.723319+0000
                                           modified        2026-01-26T09:42:58.074029+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24220}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 24220 members: 24220
                                           [mds.cephfs.compute-2.zprrum{0:24220} state up:active seq 2 addr [v2:192.168.122.102:6804/1987962990,v1:192.168.122.102:6805/1987962990] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
Jan 26 09:42:58 compute-1 sudo[85651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:42:58 compute-1 sudo[85651]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:58 compute-1 sudo[85651]: pam_unix(sudo:session): session closed for user root
Jan 26 09:42:58 compute-1 sudo[85676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:42:58 compute-1 sudo[85676]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:42:59 compute-1 podman[85742]: 2026-01-26 09:42:59.04377294 +0000 UTC m=+0.039613578 container create d1b15216db57108421fe62ffe42b7031ff968e38fa34850d834e3311659e2a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_wilson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 09:42:59 compute-1 ceph-mon[80107]: mds.? [v2:192.168.122.102:6804/1987962990,v1:192.168.122.102:6805/1987962990] up:active
Jan 26 09:42:59 compute-1 ceph-mon[80107]: fsmap cephfs:1 {0=cephfs.compute-2.zprrum=up:active}
Jan 26 09:42:59 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:59 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:59 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:42:59 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.rbkelk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 26 09:42:59 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.rbkelk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 26 09:42:59 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:42:59 compute-1 ceph-mon[80107]: 7.b scrub starts
Jan 26 09:42:59 compute-1 ceph-mon[80107]: 7.b scrub ok
Jan 26 09:42:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/620644567' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Jan 26 09:42:59 compute-1 systemd[1]: Started libpod-conmon-d1b15216db57108421fe62ffe42b7031ff968e38fa34850d834e3311659e2a31.scope.
Jan 26 09:42:59 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).mds e6 new map
Jan 26 09:42:59 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).mds e6 print_map
                                           e6
                                           btime 2026-01-26T09:42:59.090306+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-26T09:42:37.723319+0000
                                           modified        2026-01-26T09:42:58.074029+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24220}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 24220 members: 24220
                                           [mds.cephfs.compute-2.zprrum{0:24220} state up:active seq 2 addr [v2:192.168.122.102:6804/1987962990,v1:192.168.122.102:6805/1987962990] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.zhqpiu{-1:14568} state up:standby seq 1 addr [v2:192.168.122.100:6806/4011782606,v1:192.168.122.100:6807/4011782606] compat {c=[1],r=[1],i=[1fff]}]
Jan 26 09:42:59 compute-1 podman[85742]: 2026-01-26 09:42:59.026401728 +0000 UTC m=+0.022242406 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:42:59 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).mds e7 new map
Jan 26 09:42:59 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).mds e7 print_map
                                           e7
                                           btime 2026-01-26T09:42:59.119781+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-26T09:42:37.723319+0000
                                           modified        2026-01-26T09:42:58.074029+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24220}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24220 members: 24220
                                           [mds.cephfs.compute-2.zprrum{0:24220} state up:active seq 2 addr [v2:192.168.122.102:6804/1987962990,v1:192.168.122.102:6805/1987962990] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.zhqpiu{-1:14568} state up:standby seq 1 addr [v2:192.168.122.100:6806/4011782606,v1:192.168.122.100:6807/4011782606] compat {c=[1],r=[1],i=[1fff]}]
Jan 26 09:42:59 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:42:59 compute-1 podman[85742]: 2026-01-26 09:42:59.156527712 +0000 UTC m=+0.152368390 container init d1b15216db57108421fe62ffe42b7031ff968e38fa34850d834e3311659e2a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_wilson, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Jan 26 09:42:59 compute-1 podman[85742]: 2026-01-26 09:42:59.164425351 +0000 UTC m=+0.160265999 container start d1b15216db57108421fe62ffe42b7031ff968e38fa34850d834e3311659e2a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_wilson, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Jan 26 09:42:59 compute-1 podman[85742]: 2026-01-26 09:42:59.167462385 +0000 UTC m=+0.163303083 container attach d1b15216db57108421fe62ffe42b7031ff968e38fa34850d834e3311659e2a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_wilson, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 09:42:59 compute-1 funny_wilson[85759]: 167 167
Jan 26 09:42:59 compute-1 systemd[1]: libpod-d1b15216db57108421fe62ffe42b7031ff968e38fa34850d834e3311659e2a31.scope: Deactivated successfully.
Jan 26 09:42:59 compute-1 conmon[85759]: conmon d1b15216db57108421fe <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d1b15216db57108421fe62ffe42b7031ff968e38fa34850d834e3311659e2a31.scope/container/memory.events
Jan 26 09:42:59 compute-1 podman[85742]: 2026-01-26 09:42:59.174013996 +0000 UTC m=+0.169854644 container died d1b15216db57108421fe62ffe42b7031ff968e38fa34850d834e3311659e2a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_wilson, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:42:59 compute-1 systemd[1]: var-lib-containers-storage-overlay-504c794a301c97feb6f5d9e96dce7e817095e716761909643c10cdf3e8f63064-merged.mount: Deactivated successfully.
Jan 26 09:42:59 compute-1 podman[85742]: 2026-01-26 09:42:59.219961079 +0000 UTC m=+0.215801737 container remove d1b15216db57108421fe62ffe42b7031ff968e38fa34850d834e3311659e2a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_wilson, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 09:42:59 compute-1 systemd[1]: libpod-conmon-d1b15216db57108421fe62ffe42b7031ff968e38fa34850d834e3311659e2a31.scope: Deactivated successfully.
Jan 26 09:42:59 compute-1 systemd[1]: Reloading.
Jan 26 09:42:59 compute-1 systemd-sysv-generator[85806]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:42:59 compute-1 systemd-rc-local-generator[85803]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:42:59 compute-1 systemd[1]: Reloading.
Jan 26 09:42:59 compute-1 systemd-rc-local-generator[85843]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:42:59 compute-1 systemd-sysv-generator[85846]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:42:59 compute-1 systemd[1]: Starting Ceph mds.cephfs.compute-1.rbkelk for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 09:43:00 compute-1 podman[85900]: 2026-01-26 09:43:00.055235279 +0000 UTC m=+0.042345883 container create 402cc330e5f0ab6e9d9a513bc9f6c24dd12bc82eabc190b23d3cfb24ed7852db (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mds-cephfs-compute-1-rbkelk, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Jan 26 09:43:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1698a2e20032296d1ccb8212699fab3c67b3d6b3c7b2f303b7c12ebd42dd93c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 09:43:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1698a2e20032296d1ccb8212699fab3c67b3d6b3c7b2f303b7c12ebd42dd93c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:43:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1698a2e20032296d1ccb8212699fab3c67b3d6b3c7b2f303b7c12ebd42dd93c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 09:43:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1698a2e20032296d1ccb8212699fab3c67b3d6b3c7b2f303b7c12ebd42dd93c/merged/var/lib/ceph/mds/ceph-cephfs.compute-1.rbkelk supports timestamps until 2038 (0x7fffffff)
Jan 26 09:43:00 compute-1 podman[85900]: 2026-01-26 09:43:00.116685341 +0000 UTC m=+0.103795985 container init 402cc330e5f0ab6e9d9a513bc9f6c24dd12bc82eabc190b23d3cfb24ed7852db (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mds-cephfs-compute-1-rbkelk, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Jan 26 09:43:00 compute-1 podman[85900]: 2026-01-26 09:43:00.122034929 +0000 UTC m=+0.109145523 container start 402cc330e5f0ab6e9d9a513bc9f6c24dd12bc82eabc190b23d3cfb24ed7852db (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mds-cephfs-compute-1-rbkelk, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Jan 26 09:43:00 compute-1 bash[85900]: 402cc330e5f0ab6e9d9a513bc9f6c24dd12bc82eabc190b23d3cfb24ed7852db
Jan 26 09:43:00 compute-1 podman[85900]: 2026-01-26 09:43:00.038583218 +0000 UTC m=+0.025693832 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:43:00 compute-1 ceph-mon[80107]: Deploying daemon mds.cephfs.compute-1.rbkelk on compute-1
Jan 26 09:43:00 compute-1 ceph-mon[80107]: pgmap v18: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 53 KiB/s rd, 1.2 KiB/s wr, 85 op/s
Jan 26 09:43:00 compute-1 ceph-mon[80107]: mds.? [v2:192.168.122.100:6806/4011782606,v1:192.168.122.100:6807/4011782606] up:boot
Jan 26 09:43:00 compute-1 ceph-mon[80107]: fsmap cephfs:1 {0=cephfs.compute-2.zprrum=up:active} 1 up:standby
Jan 26 09:43:00 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.zhqpiu"}]: dispatch
Jan 26 09:43:00 compute-1 ceph-mon[80107]: fsmap cephfs:1 {0=cephfs.compute-2.zprrum=up:active} 1 up:standby
Jan 26 09:43:00 compute-1 ceph-mon[80107]: 7.1e scrub starts
Jan 26 09:43:00 compute-1 ceph-mon[80107]: 7.1e scrub ok
Jan 26 09:43:00 compute-1 systemd[1]: Started Ceph mds.cephfs.compute-1.rbkelk for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:43:00 compute-1 sudo[85676]: pam_unix(sudo:session): session closed for user root
Jan 26 09:43:00 compute-1 ceph-mds[85919]: set uid:gid to 167:167 (ceph:ceph)
Jan 26 09:43:00 compute-1 ceph-mds[85919]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Jan 26 09:43:00 compute-1 ceph-mds[85919]: main not setting numa affinity
Jan 26 09:43:00 compute-1 ceph-mds[85919]: pidfile_write: ignore empty --pid-file
Jan 26 09:43:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mds-cephfs-compute-1-rbkelk[85915]: starting mds.cephfs.compute-1.rbkelk at 
Jan 26 09:43:00 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Updating MDS map to version 7 from mon.2
Jan 26 09:43:00 compute-1 sudo[85938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:43:00 compute-1 sudo[85938]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:43:00 compute-1 sudo[85938]: pam_unix(sudo:session): session closed for user root
Jan 26 09:43:00 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:43:00 compute-1 sudo[85963]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:43:00 compute-1 sudo[85963]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:43:01 compute-1 podman[86028]: 2026-01-26 09:43:01.057868744 +0000 UTC m=+0.042620652 container create f9a7a0a589cf141e7c263f7fefa150839991b3eb1036543f6edec3aa4f11a0d2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_chatterjee, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Jan 26 09:43:01 compute-1 systemd[1]: Started libpod-conmon-f9a7a0a589cf141e7c263f7fefa150839991b3eb1036543f6edec3aa4f11a0d2.scope.
Jan 26 09:43:01 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:43:01 compute-1 podman[86028]: 2026-01-26 09:43:01.041098119 +0000 UTC m=+0.025850037 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:43:01 compute-1 podman[86028]: 2026-01-26 09:43:01.152355599 +0000 UTC m=+0.137107537 container init f9a7a0a589cf141e7c263f7fefa150839991b3eb1036543f6edec3aa4f11a0d2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_chatterjee, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Jan 26 09:43:01 compute-1 podman[86028]: 2026-01-26 09:43:01.1578089 +0000 UTC m=+0.142560798 container start f9a7a0a589cf141e7c263f7fefa150839991b3eb1036543f6edec3aa4f11a0d2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_chatterjee, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 09:43:01 compute-1 podman[86028]: 2026-01-26 09:43:01.1613667 +0000 UTC m=+0.146118598 container attach f9a7a0a589cf141e7c263f7fefa150839991b3eb1036543f6edec3aa4f11a0d2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_chatterjee, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Jan 26 09:43:01 compute-1 clever_chatterjee[86044]: 167 167
Jan 26 09:43:01 compute-1 systemd[1]: libpod-f9a7a0a589cf141e7c263f7fefa150839991b3eb1036543f6edec3aa4f11a0d2.scope: Deactivated successfully.
Jan 26 09:43:01 compute-1 podman[86028]: 2026-01-26 09:43:01.164857736 +0000 UTC m=+0.149609644 container died f9a7a0a589cf141e7c263f7fefa150839991b3eb1036543f6edec3aa4f11a0d2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_chatterjee, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 09:43:01 compute-1 systemd[1]: var-lib-containers-storage-overlay-02d9d9ad2b40affc72d3464139b44c2e6e27a5882656f0071e6df85d150186ba-merged.mount: Deactivated successfully.
Jan 26 09:43:01 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:01 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:01 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:01 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:01 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:01 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:01 compute-1 ceph-mon[80107]: Creating key for client.nfs.cephfs.0.0.compute-1.thyhvc
Jan 26 09:43:01 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.thyhvc", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 26 09:43:01 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.thyhvc", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 26 09:43:01 compute-1 ceph-mon[80107]: Ensuring nfs.cephfs.0 is in the ganesha grace table
Jan 26 09:43:01 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 26 09:43:01 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 26 09:43:01 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:43:01 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 26 09:43:01 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 26 09:43:01 compute-1 ceph-mon[80107]: Rados config object exists: conf-nfs.cephfs
Jan 26 09:43:01 compute-1 ceph-mon[80107]: Creating key for client.nfs.cephfs.0.0.compute-1.thyhvc-rgw
Jan 26 09:43:01 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.thyhvc-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 26 09:43:01 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.thyhvc-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 26 09:43:01 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:43:01 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/121448194' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Jan 26 09:43:01 compute-1 podman[86028]: 2026-01-26 09:43:01.202467297 +0000 UTC m=+0.187219195 container remove f9a7a0a589cf141e7c263f7fefa150839991b3eb1036543f6edec3aa4f11a0d2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_chatterjee, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 09:43:01 compute-1 systemd[1]: libpod-conmon-f9a7a0a589cf141e7c263f7fefa150839991b3eb1036543f6edec3aa4f11a0d2.scope: Deactivated successfully.
Jan 26 09:43:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).mds e8 new map
Jan 26 09:43:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).mds e8 print_map
                                           e8
                                           btime 2026-01-26T09:43:01:199992+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        8
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-26T09:42:37.723319+0000
                                           modified        2026-01-26T09:43:01.101127+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24220}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24220 members: 24220
                                           [mds.cephfs.compute-2.zprrum{0:24220} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1987962990,v1:192.168.122.102:6805/1987962990] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.zhqpiu{-1:14568} state up:standby seq 1 addr [v2:192.168.122.100:6806/4011782606,v1:192.168.122.100:6807/4011782606] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.rbkelk{-1:24194} state up:standby seq 1 addr [v2:192.168.122.101:6804/4143393925,v1:192.168.122.101:6805/4143393925] compat {c=[1],r=[1],i=[1fff]}]
Jan 26 09:43:01 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Updating MDS map to version 8 from mon.2
Jan 26 09:43:01 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Monitors have assigned me to become a standby
Jan 26 09:43:01 compute-1 systemd[1]: Reloading.
Jan 26 09:43:01 compute-1 systemd-rc-local-generator[86082]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:43:01 compute-1 systemd-sysv-generator[86090]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:43:01 compute-1 systemd[1]: Reloading.
Jan 26 09:43:01 compute-1 systemd-rc-local-generator[86129]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:43:01 compute-1 systemd-sysv-generator[86132]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:43:01 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 09:43:01 compute-1 podman[86185]: 2026-01-26 09:43:01.96164242 +0000 UTC m=+0.033980382 container create 19c026de89e744ddc79168cf6d35f36da879cfc6e36f4542a44b0a0f6b664c36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.40.1)
Jan 26 09:43:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31ac196e154da090dbbc83357691609c1e8abfb8268037421008e730e1429aee/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 09:43:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31ac196e154da090dbbc83357691609c1e8abfb8268037421008e730e1429aee/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:43:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31ac196e154da090dbbc83357691609c1e8abfb8268037421008e730e1429aee/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 09:43:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31ac196e154da090dbbc83357691609c1e8abfb8268037421008e730e1429aee/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 09:43:02 compute-1 podman[86185]: 2026-01-26 09:43:02.005069913 +0000 UTC m=+0.077407875 container init 19c026de89e744ddc79168cf6d35f36da879cfc6e36f4542a44b0a0f6b664c36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0)
Jan 26 09:43:02 compute-1 podman[86185]: 2026-01-26 09:43:02.010111113 +0000 UTC m=+0.082449075 container start 19c026de89e744ddc79168cf6d35f36da879cfc6e36f4542a44b0a0f6b664c36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:43:02 compute-1 bash[86185]: 19c026de89e744ddc79168cf6d35f36da879cfc6e36f4542a44b0a0f6b664c36
Jan 26 09:43:02 compute-1 podman[86185]: 2026-01-26 09:43:01.947318263 +0000 UTC m=+0.019656255 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:43:02 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:43:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 09:43:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 09:43:02 compute-1 sudo[85963]: pam_unix(sudo:session): session closed for user root
Jan 26 09:43:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 09:43:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 09:43:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 09:43:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 09:43:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 09:43:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:43:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 26 09:43:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 26 09:43:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:43:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:43:02 compute-1 ceph-mon[80107]: Bind address in nfs.cephfs.0.0.compute-1.thyhvc's ganesha conf is defaulting to empty
Jan 26 09:43:02 compute-1 ceph-mon[80107]: Deploying daemon nfs.cephfs.0.0.compute-1.thyhvc on compute-1
Jan 26 09:43:02 compute-1 ceph-mon[80107]: pgmap v19: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 53 KiB/s rd, 1.2 KiB/s wr, 85 op/s
Jan 26 09:43:02 compute-1 ceph-mon[80107]: mds.? [v2:192.168.122.101:6804/4143393925,v1:192.168.122.101:6805/4143393925] up:boot
Jan 26 09:43:02 compute-1 ceph-mon[80107]: mds.? [v2:192.168.122.102:6804/1987962990,v1:192.168.122.102:6805/1987962990] up:active
Jan 26 09:43:02 compute-1 ceph-mon[80107]: fsmap cephfs:1 {0=cephfs.compute-2.zprrum=up:active} 2 up:standby
Jan 26 09:43:02 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-1.rbkelk"}]: dispatch
Jan 26 09:43:02 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:02 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:02 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:02 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:02 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.najyrz", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 26 09:43:02 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.najyrz", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 26 09:43:02 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 26 09:43:02 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 26 09:43:02 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:43:03 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).mds e9 new map
Jan 26 09:43:03 compute-1 ceph-mon[80107]: Creating key for client.nfs.cephfs.1.0.compute-2.najyrz
Jan 26 09:43:03 compute-1 ceph-mon[80107]: Ensuring nfs.cephfs.1 is in the ganesha grace table
Jan 26 09:43:03 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/59313208' entity='client.admin' cmd=[{"prefix": "osd get-require-min-compat-client"}]: dispatch
Jan 26 09:43:03 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).mds e9 print_map
                                           e9
                                           btime 2026-01-26T09:43:03:225984+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        8
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-26T09:42:37.723319+0000
                                           modified        2026-01-26T09:43:01.101127+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24220}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24220 members: 24220
                                           [mds.cephfs.compute-2.zprrum{0:24220} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1987962990,v1:192.168.122.102:6805/1987962990] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.zhqpiu{-1:14568} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/4011782606,v1:192.168.122.100:6807/4011782606] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.rbkelk{-1:24194} state up:standby seq 1 addr [v2:192.168.122.101:6804/4143393925,v1:192.168.122.101:6805/4143393925] compat {c=[1],r=[1],i=[1fff]}]
Jan 26 09:43:04 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).mds e10 new map
Jan 26 09:43:04 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).mds e10 print_map
                                           e10
                                           btime 2026-01-26T09:43:04:246912+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        8
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-26T09:42:37.723319+0000
                                           modified        2026-01-26T09:43:01.101127+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24220}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24220 members: 24220
                                           [mds.cephfs.compute-2.zprrum{0:24220} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1987962990,v1:192.168.122.102:6805/1987962990] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.zhqpiu{-1:14568} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/4011782606,v1:192.168.122.100:6807/4011782606] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.rbkelk{-1:24194} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/4143393925,v1:192.168.122.101:6805/4143393925] compat {c=[1],r=[1],i=[1fff]}]
Jan 26 09:43:04 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Updating MDS map to version 10 from mon.2
Jan 26 09:43:04 compute-1 ceph-mon[80107]: pgmap v20: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 53 KiB/s rd, 1.2 KiB/s wr, 85 op/s
Jan 26 09:43:04 compute-1 ceph-mon[80107]: mds.? [v2:192.168.122.100:6806/4011782606,v1:192.168.122.100:6807/4011782606] up:standby
Jan 26 09:43:04 compute-1 ceph-mon[80107]: fsmap cephfs:1 {0=cephfs.compute-2.zprrum=up:active} 2 up:standby
Jan 26 09:43:04 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1572231238' entity='client.admin' cmd=[{"prefix": "versions", "format": "json"}]: dispatch
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000001:nfs.cephfs.0: -2
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 09:43:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 09:43:05 compute-1 ceph-mon[80107]: mds.? [v2:192.168.122.101:6804/4143393925,v1:192.168.122.101:6805/4143393925] up:standby
Jan 26 09:43:05 compute-1 ceph-mon[80107]: fsmap cephfs:1 {0=cephfs.compute-2.zprrum=up:active} 2 up:standby
Jan 26 09:43:05 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 26 09:43:05 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 26 09:43:05 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.najyrz-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 26 09:43:05 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.najyrz-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 26 09:43:05 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:43:05 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:43:06 compute-1 ceph-mon[80107]: pgmap v21: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 54 KiB/s rd, 1.9 KiB/s wr, 87 op/s
Jan 26 09:43:06 compute-1 ceph-mon[80107]: Rados config object exists: conf-nfs.cephfs
Jan 26 09:43:06 compute-1 ceph-mon[80107]: Creating key for client.nfs.cephfs.1.0.compute-2.najyrz-rgw
Jan 26 09:43:06 compute-1 ceph-mon[80107]: Bind address in nfs.cephfs.1.0.compute-2.najyrz's ganesha conf is defaulting to empty
Jan 26 09:43:06 compute-1 ceph-mon[80107]: Deploying daemon nfs.cephfs.1.0.compute-2.najyrz on compute-2
Jan 26 09:43:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:07 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:43:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:07 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:43:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:07 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:43:08 compute-1 ceph-mon[80107]: pgmap v22: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 54 KiB/s rd, 1.9 KiB/s wr, 87 op/s
Jan 26 09:43:08 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:08 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:08 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:08 compute-1 ceph-mon[80107]: Creating key for client.nfs.cephfs.2.0.compute-0.zfynkw
Jan 26 09:43:08 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.zfynkw", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 26 09:43:08 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.zfynkw", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 26 09:43:08 compute-1 ceph-mon[80107]: Ensuring nfs.cephfs.2 is in the ganesha grace table
Jan 26 09:43:08 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 26 09:43:08 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 26 09:43:08 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:43:10 compute-1 ceph-mon[80107]: pgmap v23: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 55 KiB/s rd, 2.7 KiB/s wr, 89 op/s
Jan 26 09:43:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:10 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:43:10 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:43:11 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 26 09:43:11 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 26 09:43:11 compute-1 ceph-mon[80107]: Rados config object exists: conf-nfs.cephfs
Jan 26 09:43:11 compute-1 ceph-mon[80107]: Creating key for client.nfs.cephfs.2.0.compute-0.zfynkw-rgw
Jan 26 09:43:11 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.zfynkw-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 26 09:43:11 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.zfynkw-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 26 09:43:11 compute-1 ceph-mon[80107]: Bind address in nfs.cephfs.2.0.compute-0.zfynkw's ganesha conf is defaulting to empty
Jan 26 09:43:11 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:43:11 compute-1 ceph-mon[80107]: Deploying daemon nfs.cephfs.2.0.compute-0.zfynkw on compute-0
Jan 26 09:43:12 compute-1 sudo[86254]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:43:12 compute-1 sudo[86254]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:43:12 compute-1 sudo[86254]: pam_unix(sudo:session): session closed for user root
Jan 26 09:43:12 compute-1 sudo[86279]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/haproxy:2.3 --timeout 895 _orch deploy --fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:43:12 compute-1 sudo[86279]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:43:12 compute-1 ceph-mon[80107]: pgmap v24: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1.5 KiB/s wr, 4 op/s
Jan 26 09:43:12 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:12 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:12 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:12 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:12 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:13 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:43:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:13 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:43:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:13 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:43:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:13 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:43:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:13 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:43:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:13 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:43:13 compute-1 ceph-mon[80107]: Deploying daemon haproxy.nfs.cephfs.compute-1.nsxfyf on compute-1
Jan 26 09:43:13 compute-1 ceph-mon[80107]: pgmap v25: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1.5 KiB/s wr, 4 op/s
Jan 26 09:43:15 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:43:15 compute-1 podman[86346]: 2026-01-26 09:43:15.811445735 +0000 UTC m=+3.195279905 container create cd82b8b685487e6869f6514c1b202d116d53f159dcea2ee92b192ddfd4e95772 (image=quay.io/ceph/haproxy:2.3, name=trusting_wescoff)
Jan 26 09:43:15 compute-1 podman[86346]: 2026-01-26 09:43:15.795727208 +0000 UTC m=+3.179561398 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 26 09:43:15 compute-1 systemd[1]: Started libpod-conmon-cd82b8b685487e6869f6514c1b202d116d53f159dcea2ee92b192ddfd4e95772.scope.
Jan 26 09:43:15 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:43:15 compute-1 podman[86346]: 2026-01-26 09:43:15.902710862 +0000 UTC m=+3.286545062 container init cd82b8b685487e6869f6514c1b202d116d53f159dcea2ee92b192ddfd4e95772 (image=quay.io/ceph/haproxy:2.3, name=trusting_wescoff)
Jan 26 09:43:15 compute-1 podman[86346]: 2026-01-26 09:43:15.910107173 +0000 UTC m=+3.293941363 container start cd82b8b685487e6869f6514c1b202d116d53f159dcea2ee92b192ddfd4e95772 (image=quay.io/ceph/haproxy:2.3, name=trusting_wescoff)
Jan 26 09:43:15 compute-1 podman[86346]: 2026-01-26 09:43:15.913557127 +0000 UTC m=+3.297391317 container attach cd82b8b685487e6869f6514c1b202d116d53f159dcea2ee92b192ddfd4e95772 (image=quay.io/ceph/haproxy:2.3, name=trusting_wescoff)
Jan 26 09:43:15 compute-1 trusting_wescoff[86470]: 0 0
Jan 26 09:43:15 compute-1 systemd[1]: libpod-cd82b8b685487e6869f6514c1b202d116d53f159dcea2ee92b192ddfd4e95772.scope: Deactivated successfully.
Jan 26 09:43:15 compute-1 conmon[86470]: conmon cd82b8b685487e6869f6 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-cd82b8b685487e6869f6514c1b202d116d53f159dcea2ee92b192ddfd4e95772.scope/container/memory.events
Jan 26 09:43:15 compute-1 podman[86346]: 2026-01-26 09:43:15.916519228 +0000 UTC m=+3.300353398 container died cd82b8b685487e6869f6514c1b202d116d53f159dcea2ee92b192ddfd4e95772 (image=quay.io/ceph/haproxy:2.3, name=trusting_wescoff)
Jan 26 09:43:15 compute-1 systemd[1]: var-lib-containers-storage-overlay-4fb6e6ec22f4f145ed668c0992bc0d2e45431868371b875bac4ac207ab24479c-merged.mount: Deactivated successfully.
Jan 26 09:43:15 compute-1 podman[86346]: 2026-01-26 09:43:15.951387017 +0000 UTC m=+3.335221187 container remove cd82b8b685487e6869f6514c1b202d116d53f159dcea2ee92b192ddfd4e95772 (image=quay.io/ceph/haproxy:2.3, name=trusting_wescoff)
Jan 26 09:43:15 compute-1 systemd[1]: libpod-conmon-cd82b8b685487e6869f6514c1b202d116d53f159dcea2ee92b192ddfd4e95772.scope: Deactivated successfully.
Jan 26 09:43:16 compute-1 systemd[1]: Reloading.
Jan 26 09:43:16 compute-1 ceph-mon[80107]: pgmap v26: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 2.8 KiB/s rd, 1.7 KiB/s wr, 5 op/s
Jan 26 09:43:16 compute-1 systemd-rc-local-generator[86518]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:43:16 compute-1 systemd-sysv-generator[86522]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:43:16 compute-1 systemd[1]: Reloading.
Jan 26 09:43:16 compute-1 systemd-rc-local-generator[86556]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:43:16 compute-1 systemd-sysv-generator[86559]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:43:16 compute-1 systemd[1]: Starting Ceph haproxy.nfs.cephfs.compute-1.nsxfyf for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 09:43:16 compute-1 podman[86614]: 2026-01-26 09:43:16.704856203 +0000 UTC m=+0.039610290 container create 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 09:43:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c30f8b227333eb121817e81f5fdeba5332091ff0a4a2104b1df0d5d22f450f1/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Jan 26 09:43:16 compute-1 podman[86614]: 2026-01-26 09:43:16.754335501 +0000 UTC m=+0.089089618 container init 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 09:43:16 compute-1 podman[86614]: 2026-01-26 09:43:16.759449281 +0000 UTC m=+0.094203368 container start 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 09:43:16 compute-1 bash[86614]: 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29
Jan 26 09:43:16 compute-1 podman[86614]: 2026-01-26 09:43:16.686866984 +0000 UTC m=+0.021621101 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 26 09:43:16 compute-1 systemd[1]: Started Ceph haproxy.nfs.cephfs.compute-1.nsxfyf for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:43:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [NOTICE] 025/094316 (2) : New worker #1 (4) forked
Jan 26 09:43:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:16 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:16 compute-1 sudo[86279]: pam_unix(sudo:session): session closed for user root
Jan 26 09:43:17 compute-1 ceph-mon[80107]: pgmap v27: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:43:17 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:17 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:17 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:17 compute-1 ceph-mon[80107]: Deploying daemon haproxy.nfs.cephfs.compute-0.eucyze on compute-0
Jan 26 09:43:17 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:18 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14940016c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:19 compute-1 sshd-session[86646]: Connection closed by authenticating user root 178.62.249.31 port 36858 [preauth]
Jan 26 09:43:19 compute-1 ceph-mon[80107]: pgmap v28: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 5.4 KiB/s rd, 1.8 KiB/s wr, 7 op/s
Jan 26 09:43:20 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:43:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:20 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:20 compute-1 sshd-session[86648]: Invalid user admin from 104.248.198.71 port 53360
Jan 26 09:43:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:20 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a0001230 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:20 compute-1 sshd-session[86648]: Connection closed by invalid user admin 104.248.198.71 port 53360 [preauth]
Jan 26 09:43:21 compute-1 ceph-mon[80107]: pgmap v29: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 4.0 KiB/s rd, 1.1 KiB/s wr, 5 op/s
Jan 26 09:43:21 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:21 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:21 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:21 compute-1 ceph-mon[80107]: Deploying daemon haproxy.nfs.cephfs.compute-2.rbycaf on compute-2
Jan 26 09:43:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:22 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4001bd0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:22 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14940016c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:23 compute-1 ceph-mon[80107]: pgmap v30: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 4.0 KiB/s rd, 1.1 KiB/s wr, 5 op/s
Jan 26 09:43:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:24 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14880016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:24 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a0001d50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:24 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4001bd0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:24 compute-1 sudo[86650]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:43:24 compute-1 sudo[86650]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:43:24 compute-1 sudo[86650]: pam_unix(sudo:session): session closed for user root
Jan 26 09:43:24 compute-1 sudo[86675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/keepalived:2.2.4 --timeout 895 _orch deploy --fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:43:24 compute-1 sudo[86675]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:43:25 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:43:25 compute-1 ceph-mon[80107]: pgmap v31: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 4.0 KiB/s rd, 1.1 KiB/s wr, 5 op/s
Jan 26 09:43:25 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:25 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:25 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:25 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:25 compute-1 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 26 09:43:25 compute-1 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 26 09:43:25 compute-1 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 26 09:43:25 compute-1 ceph-mon[80107]: Deploying daemon keepalived.nfs.cephfs.compute-1.wvnxoh on compute-1
Jan 26 09:43:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:26 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14940016c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:26 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14880016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:26 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a0001d50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:27 compute-1 ceph-mon[80107]: pgmap v32: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 3.5 KiB/s rd, 938 B/s wr, 4 op/s
Jan 26 09:43:28 compute-1 podman[86740]: 2026-01-26 09:43:28.264663092 +0000 UTC m=+3.012077286 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 26 09:43:28 compute-1 podman[86740]: 2026-01-26 09:43:28.281560572 +0000 UTC m=+3.028974736 container create bffff4c47e045cdc9ecc4e594dd6b2800473993b8d114fb10e12cd08ac00b4a8 (image=quay.io/ceph/keepalived:2.2.4, name=elegant_dijkstra, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, name=keepalived, distribution-scope=public, release=1793, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, io.buildah.version=1.28.2, com.redhat.component=keepalived-container)
Jan 26 09:43:28 compute-1 systemd[1]: Started libpod-conmon-bffff4c47e045cdc9ecc4e594dd6b2800473993b8d114fb10e12cd08ac00b4a8.scope.
Jan 26 09:43:28 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:43:28 compute-1 podman[86740]: 2026-01-26 09:43:28.35381857 +0000 UTC m=+3.101232754 container init bffff4c47e045cdc9ecc4e594dd6b2800473993b8d114fb10e12cd08ac00b4a8 (image=quay.io/ceph/keepalived:2.2.4, name=elegant_dijkstra, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, distribution-scope=public, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, version=2.2.4, io.buildah.version=1.28.2, io.openshift.expose-services=, name=keepalived, com.redhat.component=keepalived-container, description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64)
Jan 26 09:43:28 compute-1 podman[86740]: 2026-01-26 09:43:28.364330257 +0000 UTC m=+3.111744421 container start bffff4c47e045cdc9ecc4e594dd6b2800473993b8d114fb10e12cd08ac00b4a8 (image=quay.io/ceph/keepalived:2.2.4, name=elegant_dijkstra, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, release=1793, vendor=Red Hat, Inc., description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, vcs-type=git, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9)
Jan 26 09:43:28 compute-1 podman[86740]: 2026-01-26 09:43:28.367902664 +0000 UTC m=+3.115316848 container attach bffff4c47e045cdc9ecc4e594dd6b2800473993b8d114fb10e12cd08ac00b4a8 (image=quay.io/ceph/keepalived:2.2.4, name=elegant_dijkstra, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, vcs-type=git, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, vendor=Red Hat, Inc., description=keepalived for Ceph, io.openshift.expose-services=, distribution-scope=public, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Jan 26 09:43:28 compute-1 elegant_dijkstra[86835]: 0 0
Jan 26 09:43:28 compute-1 systemd[1]: libpod-bffff4c47e045cdc9ecc4e594dd6b2800473993b8d114fb10e12cd08ac00b4a8.scope: Deactivated successfully.
Jan 26 09:43:28 compute-1 podman[86740]: 2026-01-26 09:43:28.371435571 +0000 UTC m=+3.118849735 container died bffff4c47e045cdc9ecc4e594dd6b2800473993b8d114fb10e12cd08ac00b4a8 (image=quay.io/ceph/keepalived:2.2.4, name=elegant_dijkstra, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, architecture=x86_64, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793)
Jan 26 09:43:28 compute-1 systemd[1]: var-lib-containers-storage-overlay-dff235b961c542b0cea5919d5c474015d41b6a32017cd3bea6ea6ea8cdaddf45-merged.mount: Deactivated successfully.
Jan 26 09:43:28 compute-1 podman[86740]: 2026-01-26 09:43:28.423993312 +0000 UTC m=+3.171407476 container remove bffff4c47e045cdc9ecc4e594dd6b2800473993b8d114fb10e12cd08ac00b4a8 (image=quay.io/ceph/keepalived:2.2.4, name=elegant_dijkstra, io.openshift.expose-services=, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, vcs-type=git, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9)
Jan 26 09:43:28 compute-1 systemd[1]: libpod-conmon-bffff4c47e045cdc9ecc4e594dd6b2800473993b8d114fb10e12cd08ac00b4a8.scope: Deactivated successfully.
Jan 26 09:43:28 compute-1 systemd[1]: Reloading.
Jan 26 09:43:28 compute-1 systemd-sysv-generator[86886]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:43:28 compute-1 systemd-rc-local-generator[86882]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:43:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:28 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a40089d0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:28 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14940016c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:28 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14880016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:28 compute-1 systemd[1]: Reloading.
Jan 26 09:43:28 compute-1 systemd-sysv-generator[86929]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:43:28 compute-1 systemd-rc-local-generator[86925]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:43:29 compute-1 systemd[1]: Starting Ceph keepalived.nfs.cephfs.compute-1.wvnxoh for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 09:43:29 compute-1 podman[86979]: 2026-01-26 09:43:29.328238635 +0000 UTC m=+0.048363098 container create 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, release=1793, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, version=2.2.4, vcs-type=git, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=keepalived)
Jan 26 09:43:29 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8357d20c9b2e4fd9989a87363535dc76d373866c082f1d07506c1d70a54e2f5f/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:43:29 compute-1 podman[86979]: 2026-01-26 09:43:29.386777229 +0000 UTC m=+0.106901712 container init 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, architecture=x86_64, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, io.buildah.version=1.28.2, version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, description=keepalived for Ceph, release=1793, vendor=Red Hat, Inc.)
Jan 26 09:43:29 compute-1 podman[86979]: 2026-01-26 09:43:29.393655437 +0000 UTC m=+0.113779900 container start 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, version=2.2.4, vcs-type=git, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1793, io.buildah.version=1.28.2, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived)
Jan 26 09:43:29 compute-1 bash[86979]: 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268
Jan 26 09:43:29 compute-1 podman[86979]: 2026-01-26 09:43:29.305384172 +0000 UTC m=+0.025508655 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 26 09:43:29 compute-1 systemd[1]: Started Ceph keepalived.nfs.cephfs.compute-1.wvnxoh for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:43:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:29 2026: Starting Keepalived v2.2.4 (08/21,2021)
Jan 26 09:43:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:29 2026: Running on Linux 5.14.0-661.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026 (built for Linux 5.14.0)
Jan 26 09:43:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:29 2026: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Jan 26 09:43:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:29 2026: Configuration file /etc/keepalived/keepalived.conf
Jan 26 09:43:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:29 2026: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Jan 26 09:43:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:29 2026: Starting VRRP child process, pid=4
Jan 26 09:43:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:29 2026: Startup complete
Jan 26 09:43:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:29 2026: (VI_0) Entering BACKUP STATE (init)
Jan 26 09:43:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:29 2026: VRRP_Script(check_backend) succeeded
Jan 26 09:43:29 compute-1 sudo[86675]: pam_unix(sudo:session): session closed for user root
Jan 26 09:43:30 compute-1 ceph-mon[80107]: pgmap v33: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 3.7 KiB/s rd, 938 B/s wr, 4 op/s
Jan 26 09:43:30 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:30 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:30 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:30 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:43:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:30 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a0001d50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:30 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a40089d0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:30 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14940016c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:31 compute-1 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 26 09:43:31 compute-1 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 26 09:43:31 compute-1 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 26 09:43:31 compute-1 ceph-mon[80107]: Deploying daemon keepalived.nfs.cephfs.compute-0.orrhyj on compute-0
Jan 26 09:43:32 compute-1 sshd-session[87002]: Connection closed by authenticating user root 152.42.136.110 port 59088 [preauth]
Jan 26 09:43:32 compute-1 ceph-mon[80107]: pgmap v34: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:43:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:32 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:32 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00031e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:32 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a40096e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:33 2026: (VI_0) Entering MASTER STATE
Jan 26 09:43:33 compute-1 ceph-mon[80107]: pgmap v35: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:43:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:34 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14940016c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:34 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14940016c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:34 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00031e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:35 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:43:36 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:36 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:36 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:36 compute-1 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 26 09:43:36 compute-1 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 26 09:43:36 compute-1 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 26 09:43:36 compute-1 ceph-mon[80107]: Deploying daemon keepalived.nfs.cephfs.compute-2.ovafut on compute-2
Jan 26 09:43:36 compute-1 ceph-mon[80107]: pgmap v36: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:43:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:36 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a40096e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:36 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a40096e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:36 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a40096e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:37 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 09:43:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e61 e61: 3 total, 3 up, 3 in
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.373870) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420617374129, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 6974, "num_deletes": 257, "total_data_size": 18574666, "memory_usage": 19565664, "flush_reason": "Manual Compaction"}
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420617472296, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 11825899, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 6979, "table_properties": {"data_size": 11799325, "index_size": 16859, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8645, "raw_key_size": 83620, "raw_average_key_size": 24, "raw_value_size": 11732902, "raw_average_value_size": 3395, "num_data_blocks": 745, "num_entries": 3455, "num_filter_entries": 3455, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 1769420465, "file_creation_time": 1769420617, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 98473 microseconds, and 34322 cpu microseconds.
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.472383) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 11825899 bytes OK
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.472407) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.473836) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.473852) EVENT_LOG_v1 {"time_micros": 1769420617473847, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.473868) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 18537159, prev total WAL file size 18537159, number of live WAL files 2.
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.476793) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323533' seq:0, type:0; will stop at (end)
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(11MB) 8(1773B)]
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420617476866, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 11827672, "oldest_snapshot_seqno": -1}
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3202 keys, 11822698 bytes, temperature: kUnknown
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420617566792, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 11822698, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11796640, "index_size": 16924, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8069, "raw_key_size": 80115, "raw_average_key_size": 25, "raw_value_size": 11733307, "raw_average_value_size": 3664, "num_data_blocks": 746, "num_entries": 3202, "num_filter_entries": 3202, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769420617, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.567014) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 11822698 bytes
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.568956) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 131.4 rd, 131.4 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(11.3, 0.0 +0.0 blob) out(11.3 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3460, records dropped: 258 output_compression: NoCompression
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.568984) EVENT_LOG_v1 {"time_micros": 1769420617568971, "job": 4, "event": "compaction_finished", "compaction_time_micros": 90007, "compaction_time_cpu_micros": 22555, "output_level": 6, "num_output_files": 1, "total_output_size": 11822698, "num_input_records": 3460, "num_output_records": 3202, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420617571043, "job": 4, "event": "table_file_deletion", "file_number": 14}
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420617571100, "job": 4, "event": "table_file_deletion", "file_number": 8}
Jan 26 09:43:37 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.476735) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:43:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:37 2026: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Jan 26 09:43:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:37 2026: (VI_0) Entering BACKUP STATE
Jan 26 09:43:38 compute-1 ceph-mon[80107]: pgmap v37: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:43:38 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Jan 26 09:43:38 compute-1 ceph-mon[80107]: osdmap e61: 3 total, 3 up, 3 in
Jan 26 09:43:38 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 09:43:38 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e62 e62: 3 total, 3 up, 3 in
Jan 26 09:43:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:38 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00031e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:38 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:38 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:39 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Jan 26 09:43:39 compute-1 ceph-mon[80107]: osdmap e62: 3 total, 3 up, 3 in
Jan 26 09:43:39 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 09:43:39 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 09:43:39 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 09:43:39 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e63 e63: 3 total, 3 up, 3 in
Jan 26 09:43:40 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e63 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:43:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:40 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a40096e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:40 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:40 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:41 compute-1 ceph-mon[80107]: pgmap v40: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 383 B/s rd, 0 op/s
Jan 26 09:43:41 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Jan 26 09:43:41 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 09:43:41 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 09:43:41 compute-1 ceph-mon[80107]: osdmap e63: 3 total, 3 up, 3 in
Jan 26 09:43:41 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 09:43:41 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:41 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:41 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:41 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:41 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e64 e64: 3 total, 3 up, 3 in
Jan 26 09:43:41 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Jan 26 09:43:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:41.885763) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 09:43:41 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Jan 26 09:43:41 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420621885816, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 385, "num_deletes": 251, "total_data_size": 376159, "memory_usage": 384928, "flush_reason": "Manual Compaction"}
Jan 26 09:43:41 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Jan 26 09:43:41 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420621890957, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 248024, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6984, "largest_seqno": 7364, "table_properties": {"data_size": 245642, "index_size": 482, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 5757, "raw_average_key_size": 18, "raw_value_size": 240815, "raw_average_value_size": 759, "num_data_blocks": 21, "num_entries": 317, "num_filter_entries": 317, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420617, "oldest_key_time": 1769420617, "file_creation_time": 1769420621, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Jan 26 09:43:41 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 5224 microseconds, and 1353 cpu microseconds.
Jan 26 09:43:41 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 09:43:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:41.890993) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 248024 bytes OK
Jan 26 09:43:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:41.891011) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Jan 26 09:43:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:41.892417) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Jan 26 09:43:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:41.892431) EVENT_LOG_v1 {"time_micros": 1769420621892427, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 09:43:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:41.892446) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 09:43:41 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 373573, prev total WAL file size 373573, number of live WAL files 2.
Jan 26 09:43:41 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:43:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:41.892815) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Jan 26 09:43:41 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 09:43:41 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(242KB)], [15(11MB)]
Jan 26 09:43:41 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420621892853, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 12070722, "oldest_snapshot_seqno": -1}
Jan 26 09:43:42 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3000 keys, 10943808 bytes, temperature: kUnknown
Jan 26 09:43:42 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420622036680, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 10943808, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10919746, "index_size": 15441, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7557, "raw_key_size": 76992, "raw_average_key_size": 25, "raw_value_size": 10860461, "raw_average_value_size": 3620, "num_data_blocks": 674, "num_entries": 3000, "num_filter_entries": 3000, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769420621, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Jan 26 09:43:42 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 09:43:42 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:42.036958) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 10943808 bytes
Jan 26 09:43:42 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:42.246158) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 83.9 rd, 76.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 11.3 +0.0 blob) out(10.4 +0.0 blob), read-write-amplify(92.8) write-amplify(44.1) OK, records in: 3519, records dropped: 519 output_compression: NoCompression
Jan 26 09:43:42 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:42.246230) EVENT_LOG_v1 {"time_micros": 1769420622246204, "job": 6, "event": "compaction_finished", "compaction_time_micros": 143945, "compaction_time_cpu_micros": 23740, "output_level": 6, "num_output_files": 1, "total_output_size": 10943808, "num_input_records": 3519, "num_output_records": 3000, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 09:43:42 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:43:42 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420622246547, "job": 6, "event": "table_file_deletion", "file_number": 17}
Jan 26 09:43:42 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:43:42 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420622249101, "job": 6, "event": "table_file_deletion", "file_number": 15}
Jan 26 09:43:42 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:41.892765) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:43:42 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:42.249225) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:43:42 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:42.249231) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:43:42 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:42.249233) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:43:42 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:42.249234) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:43:42 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:42.249237) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:43:42 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:42 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:42 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:42 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a400a3f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:42 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:42 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:42 compute-1 ceph-mon[80107]: Deploying daemon alertmanager.compute-0 on compute-0
Jan 26 09:43:42 compute-1 ceph-mon[80107]: pgmap v42: 260 pgs: 62 unknown, 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 26 09:43:42 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 09:43:42 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Jan 26 09:43:42 compute-1 ceph-mon[80107]: osdmap e64: 3 total, 3 up, 3 in
Jan 26 09:43:42 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 09:43:42 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e65 e65: 3 total, 3 up, 3 in
Jan 26 09:43:42 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 65 pg[10.0( v 60'48 (0'0,60'48] local-lis/les=50/51 n=8 ec=50/50 lis/c=50/50 les/c/f=51/51/0 sis=65 pruub=9.585060120s) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 60'47 mlcod 60'47 active pruub 210.508377075s@ mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:42 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 65 pg[10.0( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=50/50 lis/c=50/50 les/c/f=51/51/0 sis=65 pruub=9.585060120s) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 60'47 mlcod 0'0 unknown pruub 210.508377075s@ mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-mon[80107]: pgmap v44: 260 pgs: 62 unknown, 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 563 B/s rd, 0 op/s
Jan 26 09:43:43 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 09:43:43 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 09:43:43 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 09:43:43 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]': finished
Jan 26 09:43:43 compute-1 ceph-mon[80107]: osdmap e65: 3 total, 3 up, 3 in
Jan 26 09:43:43 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:43 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:43 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:43 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:43 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:43 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:43 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Jan 26 09:43:43 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:43 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e66 e66: 3 total, 3 up, 3 in
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1b( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.7( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.12( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.11( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.10( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1f( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1e( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1d( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1a( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.19( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1c( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.6( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.18( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.5( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.4( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.3( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.b( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.8( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.d( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.a( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.9( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.c( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.f( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.e( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1( v 60'48 (0'0,60'48] local-lis/les=50/51 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.2( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.14( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.13( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.15( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.16( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.17( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1b( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.12( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.11( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.7( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1f( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1e( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1a( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1d( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.19( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.6( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.18( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.5( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.4( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.3( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.b( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.10( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.9( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.a( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1c( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.c( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.d( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.8( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.0( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=50/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 60'47 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.e( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.13( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.14( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.15( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.2( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.16( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.f( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.17( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:44 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.1b scrub starts
Jan 26 09:43:44 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.1b scrub ok
Jan 26 09:43:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:44 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:44 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:44 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a400a3f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:44 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e67 e67: 3 total, 3 up, 3 in
Jan 26 09:43:44 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 67 pg[12.0( v 60'46 (0'0,60'46] local-lis/les=58/59 n=5 ec=58/58 lis/c=58/58 les/c/f=59/59/0 sis=67 pruub=8.900333405s) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 lcod 60'45 mlcod 60'45 active pruub 211.830657959s@ mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:44 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 67 pg[12.0( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=58/58 lis/c=58/58 les/c/f=59/59/0 sis=67 pruub=8.900333405s) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 lcod 60'45 mlcod 0'0 unknown pruub 211.830657959s@ mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:44 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1).collection(12.0_head 0x55d1702b86c0) operator()   moving buffer(0x55d170cba208 space 0x55d170cfe350 0x0~1000 clean)
Jan 26 09:43:44 compute-1 ceph-mon[80107]: Regenerating cephadm self-signed grafana TLS certificates
Jan 26 09:43:44 compute-1 ceph-mon[80107]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Jan 26 09:43:44 compute-1 ceph-mon[80107]: Deploying daemon grafana.compute-0 on compute-0
Jan 26 09:43:44 compute-1 ceph-mon[80107]: 8.12 scrub starts
Jan 26 09:43:44 compute-1 ceph-mon[80107]: 8.12 scrub ok
Jan 26 09:43:44 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 09:43:44 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 09:43:44 compute-1 ceph-mon[80107]: osdmap e66: 3 total, 3 up, 3 in
Jan 26 09:43:44 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 09:43:45 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.12 scrub starts
Jan 26 09:43:45 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.12 scrub ok
Jan 26 09:43:45 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e67 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:43:45 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e68 e68: 3 total, 3 up, 3 in
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.11( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.13( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.12( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.10( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.15( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.7( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.4( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=1 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.6( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.9( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.8( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.a( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.c( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.f( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.b( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.e( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.5( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=1 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.2( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=1 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.3( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=1 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.d( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1e( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1f( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1c( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1a( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1b( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.18( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.19( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.16( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.14( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.17( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1( v 60'46 (0'0,60'46] local-lis/les=58/59 n=1 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1d( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:45 compute-1 ceph-mon[80107]: 10.1b scrub starts
Jan 26 09:43:45 compute-1 ceph-mon[80107]: 10.1b scrub ok
Jan 26 09:43:45 compute-1 ceph-mon[80107]: 8.13 scrub starts
Jan 26 09:43:45 compute-1 ceph-mon[80107]: 8.13 scrub ok
Jan 26 09:43:45 compute-1 ceph-mon[80107]: pgmap v47: 322 pgs: 62 unknown, 64 peering, 196 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:43:45 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 09:43:45 compute-1 ceph-mon[80107]: osdmap e67: 3 total, 3 up, 3 in
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.13( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.12( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.7( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.15( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.11( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.6( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.9( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.a( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.8( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.4( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.c( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.f( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.10( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.b( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.e( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.5( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.3( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1f( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.2( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.0( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=58/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 60'45 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1a( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1c( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1e( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1b( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.18( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.19( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.16( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.14( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1d( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.17( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.d( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:46 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.11 scrub starts
Jan 26 09:43:46 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.11 scrub ok
Jan 26 09:43:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:46 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:46 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:46 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:46 compute-1 ceph-mon[80107]: 10.12 scrub starts
Jan 26 09:43:46 compute-1 ceph-mon[80107]: 10.12 scrub ok
Jan 26 09:43:46 compute-1 ceph-mon[80107]: 9.10 scrub starts
Jan 26 09:43:46 compute-1 ceph-mon[80107]: 9.10 scrub ok
Jan 26 09:43:46 compute-1 ceph-mon[80107]: osdmap e68: 3 total, 3 up, 3 in
Jan 26 09:43:47 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.1f deep-scrub starts
Jan 26 09:43:47 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.1f deep-scrub ok
Jan 26 09:43:48 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.7 scrub starts
Jan 26 09:43:48 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.7 scrub ok
Jan 26 09:43:48 compute-1 ceph-mon[80107]: 10.11 scrub starts
Jan 26 09:43:48 compute-1 ceph-mon[80107]: 10.11 scrub ok
Jan 26 09:43:48 compute-1 ceph-mon[80107]: 8.11 scrub starts
Jan 26 09:43:48 compute-1 ceph-mon[80107]: 8.11 scrub ok
Jan 26 09:43:48 compute-1 ceph-mon[80107]: pgmap v50: 353 pgs: 93 unknown, 64 peering, 196 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:43:48 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:48 compute-1 ceph-mon[80107]: 8.10 deep-scrub starts
Jan 26 09:43:48 compute-1 ceph-mon[80107]: 8.10 deep-scrub ok
Jan 26 09:43:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:48 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a400a3f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:48 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:48 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:49 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.1a scrub starts
Jan 26 09:43:49 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.1a scrub ok
Jan 26 09:43:49 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e69 e69: 3 total, 3 up, 3 in
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.10( v 68'49 (0'0,68'49] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.211609840s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=68'47 lcod 68'48 mlcod 68'48 active pruub 219.969894409s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.15( v 66'51 (0'0,66'51] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.197128296s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 66'50 mlcod 66'50 active pruub 217.955444336s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.10( v 68'49 (0'0,68'49] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.211562157s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=68'47 lcod 68'48 mlcod 0'0 unknown NOTIFY pruub 219.969894409s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.15( v 66'51 (0'0,66'51] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.197095871s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 66'50 mlcod 0'0 unknown NOTIFY pruub 217.955444336s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.13( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.204886436s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.963439941s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.14( v 66'51 (0'0,66'51] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.196805000s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 66'50 mlcod 66'50 active pruub 217.955413818s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.12( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.204831123s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.963439941s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.13( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.204850197s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.963439941s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.12( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.204812050s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.963439941s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.14( v 66'51 (0'0,66'51] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.196767807s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 66'50 mlcod 0'0 unknown NOTIFY pruub 217.955413818s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.13( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.196599960s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.955383301s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.13( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.196578979s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.955383301s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.2( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.196598053s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.955474854s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.4( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210849762s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969787598s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.2( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.196548462s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.955474854s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.4( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210830688s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969787598s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.1( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.196307182s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.955322266s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.7( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210350990s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969390869s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.1( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.196286201s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.955322266s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.6( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210316658s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969497681s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.6( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210298538s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969497681s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.f( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.196280479s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.955535889s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.f( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.196266174s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.955535889s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.9( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210173607s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969512939s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.9( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210156441s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969512939s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.8( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210277557s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969680786s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.8( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210260391s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969680786s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.a( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210008621s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969558716s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.a( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209998131s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969558716s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.7( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210334778s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969390869s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.b( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210254669s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969909668s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.b( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210244179s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969909668s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.c( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210129738s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969818115s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.e( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210205078s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969924927s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.8( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.195512772s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.955230713s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.8( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.195491791s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.955230713s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.e( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210190773s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969924927s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.c( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210109711s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969818115s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.3( v 66'51 (0'0,66'51] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.195200920s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 66'50 mlcod 66'50 active pruub 217.955062866s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.3( v 66'51 (0'0,66'51] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.195178032s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 66'50 mlcod 0'0 unknown NOTIFY pruub 217.955062866s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.4( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.195085526s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.955047607s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.4( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.195068359s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.955047607s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.2( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209959984s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969940186s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.2( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209941864s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969940186s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.3( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209921837s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.970046997s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.3( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209904671s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.970046997s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.5( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.194754601s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.955001831s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.11( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209153175s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969436646s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.11( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209136009s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969436646s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.5( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.194731712s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.955001831s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.18( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.194509506s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.954986572s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.18( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.194488525s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.954986572s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.1e( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209237099s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969985962s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.1e( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209216118s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969985962s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.19( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.194181442s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.954910278s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.1c( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209335327s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.970230103s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.1c( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209313393s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.970230103s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.1a( v 68'49 (0'0,68'49] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209171295s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 68'48 mlcod 68'48 active pruub 219.970169067s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.1a( v 68'49 (0'0,68'49] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209129333s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 68'48 mlcod 0'0 unknown NOTIFY pruub 219.970169067s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.1e( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.193212509s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.954437256s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.18( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209067345s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.970306396s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.18( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209049225s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.970306396s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.1e( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.193169594s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.954437256s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.19( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.208838463s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.970321655s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.19( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.208822250s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.970321655s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.10( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.193565369s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.955093384s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.19( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.193413734s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.954910278s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.11( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.192659378s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.954299927s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.11( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.192640305s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.954299927s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.17( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.208807945s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.970520020s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.17( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.208787918s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.970520020s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.12( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.192235947s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.953994751s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.12( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.192203522s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.953994751s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.1d( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.208681107s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.970489502s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.10( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.193547249s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.955093384s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.1d( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.208670616s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.970489502s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.1b( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.154850006s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.916732788s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.1b( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.154829025s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.916732788s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:43:49 compute-1 ceph-mon[80107]: 10.1f deep-scrub starts
Jan 26 09:43:49 compute-1 ceph-mon[80107]: 10.1f deep-scrub ok
Jan 26 09:43:49 compute-1 ceph-mon[80107]: 10.7 scrub starts
Jan 26 09:43:49 compute-1 ceph-mon[80107]: 10.7 scrub ok
Jan 26 09:43:49 compute-1 ceph-mon[80107]: 9.11 scrub starts
Jan 26 09:43:49 compute-1 ceph-mon[80107]: 9.11 scrub ok
Jan 26 09:43:49 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 09:43:49 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 09:43:49 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 09:43:49 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Jan 26 09:43:49 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[8.12( empty local-lis/les=0/0 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.12( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[8.10( empty local-lis/les=0/0 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.7( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[8.4( empty local-lis/les=0/0 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.4( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.14( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[8.17( empty local-lis/les=0/0 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.5( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[8.8( empty local-lis/les=0/0 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.f( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.1( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.1e( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.1d( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.1c( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[8.18( empty local-lis/les=0/0 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.1b( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.1a( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[8.19( empty local-lis/les=0/0 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[8.1b( empty local-lis/les=0/0 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:49 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[8.14( empty local-lis/les=0/0 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:50 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.16 scrub starts
Jan 26 09:43:50 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.16 scrub ok
Jan 26 09:43:50 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e69 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:43:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:50 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:50 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480000d00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:50 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:50 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e70 e70: 3 total, 3 up, 3 in
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.1a( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[8.10( v 47'9 (0'0,47'9] local-lis/les=69/70 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=47'9 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:50 compute-1 ceph-mon[80107]: pgmap v51: 353 pgs: 353 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:43:50 compute-1 ceph-mon[80107]: 10.1a scrub starts
Jan 26 09:43:50 compute-1 ceph-mon[80107]: 10.1a scrub ok
Jan 26 09:43:50 compute-1 ceph-mon[80107]: 9.5 scrub starts
Jan 26 09:43:50 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 09:43:50 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 09:43:50 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 09:43:50 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Jan 26 09:43:50 compute-1 ceph-mon[80107]: 9.5 scrub ok
Jan 26 09:43:50 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 09:43:50 compute-1 ceph-mon[80107]: osdmap e69: 3 total, 3 up, 3 in
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.1e( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[8.19( v 47'9 lc 0'0 (0'0,47'9] local-lis/les=69/70 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=47'9 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[8.12( v 47'9 (0'0,47'9] local-lis/les=69/70 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=47'9 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.1c( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[8.1b( v 47'9 (0'0,47'9] local-lis/les=69/70 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=47'9 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.1b( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.7( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.4( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[8.4( v 47'9 (0'0,47'9] local-lis/les=69/70 n=1 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=47'9 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.5( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[8.18( v 47'9 (0'0,47'9] local-lis/les=69/70 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=47'9 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.1d( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.1( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[8.8( v 47'9 (0'0,47'9] local-lis/les=69/70 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=47'9 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.f( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[8.17( v 47'9 (0'0,47'9] local-lis/les=69/70 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=47'9 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.14( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[8.14( v 47'9 (0'0,47'9] local-lis/les=69/70 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=47'9 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:50 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.12( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:43:51 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.17 scrub starts
Jan 26 09:43:51 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.17 scrub ok
Jan 26 09:43:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e71 e71: 3 total, 3 up, 3 in
Jan 26 09:43:51 compute-1 ceph-mon[80107]: 10.16 scrub starts
Jan 26 09:43:51 compute-1 ceph-mon[80107]: 10.16 scrub ok
Jan 26 09:43:51 compute-1 ceph-mon[80107]: 11.10 scrub starts
Jan 26 09:43:51 compute-1 ceph-mon[80107]: 11.10 scrub ok
Jan 26 09:43:51 compute-1 ceph-mon[80107]: pgmap v53: 353 pgs: 353 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:43:51 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Jan 26 09:43:51 compute-1 ceph-mon[80107]: osdmap e70: 3 total, 3 up, 3 in
Jan 26 09:43:51 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Jan 26 09:43:51 compute-1 ceph-mon[80107]: osdmap e71: 3 total, 3 up, 3 in
Jan 26 09:43:52 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.15 scrub starts
Jan 26 09:43:52 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.15 scrub ok
Jan 26 09:43:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:52 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14780016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:52 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:52 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:52 compute-1 ceph-mon[80107]: 10.17 scrub starts
Jan 26 09:43:52 compute-1 ceph-mon[80107]: 10.17 scrub ok
Jan 26 09:43:52 compute-1 ceph-mon[80107]: 9.12 scrub starts
Jan 26 09:43:52 compute-1 ceph-mon[80107]: 9.12 scrub ok
Jan 26 09:43:52 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:52 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:52 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:52 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:52 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:52 compute-1 ceph-mon[80107]: Deploying daemon haproxy.rgw.default.compute-0.ovxbdp on compute-0
Jan 26 09:43:52 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Jan 26 09:43:53 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.0 scrub starts
Jan 26 09:43:53 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.0 scrub ok
Jan 26 09:43:53 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e72 e72: 3 total, 3 up, 3 in
Jan 26 09:43:54 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.e scrub starts
Jan 26 09:43:54 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.e scrub ok
Jan 26 09:43:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:54 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:54 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14780016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:54 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14780016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:54 compute-1 ceph-mon[80107]: 12.15 scrub starts
Jan 26 09:43:54 compute-1 ceph-mon[80107]: 12.15 scrub ok
Jan 26 09:43:54 compute-1 ceph-mon[80107]: 8.6 scrub starts
Jan 26 09:43:54 compute-1 ceph-mon[80107]: 8.6 scrub ok
Jan 26 09:43:54 compute-1 ceph-mon[80107]: 11.11 scrub starts
Jan 26 09:43:54 compute-1 ceph-mon[80107]: 11.11 scrub ok
Jan 26 09:43:54 compute-1 ceph-mon[80107]: pgmap v56: 353 pgs: 353 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:43:54 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Jan 26 09:43:54 compute-1 ceph-mon[80107]: osdmap e72: 3 total, 3 up, 3 in
Jan 26 09:43:54 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e73 e73: 3 total, 3 up, 3 in
Jan 26 09:43:55 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.c scrub starts
Jan 26 09:43:55 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.c scrub ok
Jan 26 09:43:55 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e73 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:43:56 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.a scrub starts
Jan 26 09:43:56 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.a scrub ok
Jan 26 09:43:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e74 e74: 3 total, 3 up, 3 in
Jan 26 09:43:56 compute-1 ceph-mon[80107]: 10.0 scrub starts
Jan 26 09:43:56 compute-1 ceph-mon[80107]: 10.0 scrub ok
Jan 26 09:43:56 compute-1 ceph-mon[80107]: 8.f scrub starts
Jan 26 09:43:56 compute-1 ceph-mon[80107]: 8.f scrub ok
Jan 26 09:43:56 compute-1 ceph-mon[80107]: 8.7 scrub starts
Jan 26 09:43:56 compute-1 ceph-mon[80107]: 8.7 scrub ok
Jan 26 09:43:56 compute-1 ceph-mon[80107]: 10.e scrub starts
Jan 26 09:43:56 compute-1 ceph-mon[80107]: 10.e scrub ok
Jan 26 09:43:56 compute-1 ceph-mon[80107]: 12.7 scrub starts
Jan 26 09:43:56 compute-1 ceph-mon[80107]: 12.7 scrub ok
Jan 26 09:43:56 compute-1 ceph-mon[80107]: 11.9 scrub starts
Jan 26 09:43:56 compute-1 ceph-mon[80107]: pgmap v58: 353 pgs: 353 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 116 B/s, 0 keys/s, 2 objects/s recovering
Jan 26 09:43:56 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Jan 26 09:43:56 compute-1 ceph-mon[80107]: 11.9 scrub ok
Jan 26 09:43:56 compute-1 ceph-mon[80107]: osdmap e73: 3 total, 3 up, 3 in
Jan 26 09:43:56 compute-1 ceph-mon[80107]: 10.c scrub starts
Jan 26 09:43:56 compute-1 ceph-mon[80107]: 10.c scrub ok
Jan 26 09:43:56 compute-1 ceph-mon[80107]: 11.15 scrub starts
Jan 26 09:43:56 compute-1 ceph-mon[80107]: 11.15 scrub ok
Jan 26 09:43:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:43:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.003000080s ======
Jan 26 09:43:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:43:56.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000080s
Jan 26 09:43:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:56 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:56 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:56 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:57 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.9 scrub starts
Jan 26 09:43:57 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.9 scrub ok
Jan 26 09:43:57 compute-1 ceph-mon[80107]: 10.3 scrub starts
Jan 26 09:43:57 compute-1 ceph-mon[80107]: 10.3 scrub ok
Jan 26 09:43:57 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:57 compute-1 ceph-mon[80107]: 10.a scrub starts
Jan 26 09:43:57 compute-1 ceph-mon[80107]: 10.a scrub ok
Jan 26 09:43:57 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Jan 26 09:43:57 compute-1 ceph-mon[80107]: osdmap e74: 3 total, 3 up, 3 in
Jan 26 09:43:57 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:57 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:57 compute-1 ceph-mon[80107]: Deploying daemon haproxy.rgw.default.compute-2.yyinob on compute-2
Jan 26 09:43:57 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Jan 26 09:43:57 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e75 e75: 3 total, 3 up, 3 in
Jan 26 09:43:58 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.f scrub starts
Jan 26 09:43:58 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.f scrub ok
Jan 26 09:43:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:43:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:43:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:43:58.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:43:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:58 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14780016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:58 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480002140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:43:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:43:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:43:58.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:43:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:58 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:43:58 compute-1 ceph-mon[80107]: pgmap v61: 353 pgs: 353 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 121 B/s, 0 keys/s, 2 objects/s recovering
Jan 26 09:43:58 compute-1 ceph-mon[80107]: 10.9 scrub starts
Jan 26 09:43:58 compute-1 ceph-mon[80107]: 10.9 scrub ok
Jan 26 09:43:58 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Jan 26 09:43:58 compute-1 ceph-mon[80107]: osdmap e75: 3 total, 3 up, 3 in
Jan 26 09:43:58 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:58 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:58 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:58 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:43:58 compute-1 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 26 09:43:58 compute-1 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 26 09:43:58 compute-1 ceph-mon[80107]: Deploying daemon keepalived.rgw.default.compute-0.dhkprh on compute-0
Jan 26 09:43:58 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Jan 26 09:43:58 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.d deep-scrub starts
Jan 26 09:43:59 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.d deep-scrub ok
Jan 26 09:43:59 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e76 e76: 3 total, 3 up, 3 in
Jan 26 09:43:59 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 76 pg[9.6( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=76) [1] r=0 lpr=76 pi=[63,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:59 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 76 pg[9.16( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=76) [1] r=0 lpr=76 pi=[63,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:59 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 76 pg[9.e( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=76) [1] r=0 lpr=76 pi=[63,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:59 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 76 pg[9.1e( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=76) [1] r=0 lpr=76 pi=[63,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:43:59 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.b scrub starts
Jan 26 09:43:59 compute-1 ceph-mon[80107]: 12.f scrub starts
Jan 26 09:43:59 compute-1 ceph-mon[80107]: 12.f scrub ok
Jan 26 09:43:59 compute-1 ceph-mon[80107]: 9.a deep-scrub starts
Jan 26 09:43:59 compute-1 ceph-mon[80107]: 9.a deep-scrub ok
Jan 26 09:43:59 compute-1 ceph-mon[80107]: pgmap v63: 353 pgs: 8 active+remapped, 345 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 209 B/s, 6 objects/s recovering
Jan 26 09:43:59 compute-1 ceph-mon[80107]: 10.d deep-scrub starts
Jan 26 09:43:59 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Jan 26 09:43:59 compute-1 ceph-mon[80107]: osdmap e76: 3 total, 3 up, 3 in
Jan 26 09:43:59 compute-1 ceph-mon[80107]: 9.8 scrub starts
Jan 26 09:43:59 compute-1 ceph-mon[80107]: 9.8 scrub ok
Jan 26 09:43:59 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.b scrub ok
Jan 26 09:44:00 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e77 e77: 3 total, 3 up, 3 in
Jan 26 09:44:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 77 pg[9.16( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=77) [1]/[0] r=-1 lpr=77 pi=[63,77)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 77 pg[9.e( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=77) [1]/[0] r=-1 lpr=77 pi=[63,77)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 77 pg[9.e( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=77) [1]/[0] r=-1 lpr=77 pi=[63,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 09:44:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 77 pg[9.16( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=77) [1]/[0] r=-1 lpr=77 pi=[63,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 09:44:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 77 pg[9.6( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=77) [1]/[0] r=-1 lpr=77 pi=[63,77)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 77 pg[9.6( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=77) [1]/[0] r=-1 lpr=77 pi=[63,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 09:44:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 77 pg[9.1e( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=77) [1]/[0] r=-1 lpr=77 pi=[63,77)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:00 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 77 pg[9.1e( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=77) [1]/[0] r=-1 lpr=77 pi=[63,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 09:44:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:44:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:00.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:44:00 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e77 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:44:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:00 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:00 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478002f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:44:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:00.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:44:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:00 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480002140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:00 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.d deep-scrub starts
Jan 26 09:44:00 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.d deep-scrub ok
Jan 26 09:44:01 compute-1 ceph-mon[80107]: 10.d deep-scrub ok
Jan 26 09:44:01 compute-1 ceph-mon[80107]: 9.13 scrub starts
Jan 26 09:44:01 compute-1 ceph-mon[80107]: 9.13 scrub ok
Jan 26 09:44:01 compute-1 ceph-mon[80107]: 10.b scrub starts
Jan 26 09:44:01 compute-1 ceph-mon[80107]: 10.b scrub ok
Jan 26 09:44:01 compute-1 ceph-mon[80107]: osdmap e77: 3 total, 3 up, 3 in
Jan 26 09:44:01 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:01 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:01 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:01 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Jan 26 09:44:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e78 e78: 3 total, 3 up, 3 in
Jan 26 09:44:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:02.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:02 compute-1 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 26 09:44:02 compute-1 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 26 09:44:02 compute-1 ceph-mon[80107]: Deploying daemon keepalived.rgw.default.compute-2.djgvpg on compute-2
Jan 26 09:44:02 compute-1 ceph-mon[80107]: pgmap v66: 353 pgs: 8 active+remapped, 345 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 253 B/s, 7 objects/s recovering
Jan 26 09:44:02 compute-1 ceph-mon[80107]: 12.d deep-scrub starts
Jan 26 09:44:02 compute-1 ceph-mon[80107]: 12.d deep-scrub ok
Jan 26 09:44:02 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Jan 26 09:44:02 compute-1 ceph-mon[80107]: osdmap e78: 3 total, 3 up, 3 in
Jan 26 09:44:02 compute-1 ceph-mon[80107]: 9.1f scrub starts
Jan 26 09:44:02 compute-1 ceph-mon[80107]: 9.4 scrub starts
Jan 26 09:44:02 compute-1 ceph-mon[80107]: 9.1f scrub ok
Jan 26 09:44:02 compute-1 ceph-mon[80107]: 9.4 scrub ok
Jan 26 09:44:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e79 e79: 3 total, 3 up, 3 in
Jan 26 09:44:02 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 79 pg[9.6( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=6 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:02 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 79 pg[9.e( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=6 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:02 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 79 pg[9.e( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=6 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:02 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 79 pg[9.16( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:02 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 79 pg[9.6( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=6 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:02 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 79 pg[9.16( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:02 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 79 pg[9.1e( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:02 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 79 pg[9.1e( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:44:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:02.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:44:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478002f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:03 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e80 e80: 3 total, 3 up, 3 in
Jan 26 09:44:04 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 80 pg[9.6( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=6 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:44:04 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 80 pg[9.e( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=6 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:44:04 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 80 pg[9.16( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=5 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:44:04 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 80 pg[9.1e( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=5 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:44:04 compute-1 ceph-mon[80107]: 11.6 scrub starts
Jan 26 09:44:04 compute-1 ceph-mon[80107]: 11.6 scrub ok
Jan 26 09:44:04 compute-1 ceph-mon[80107]: 11.a scrub starts
Jan 26 09:44:04 compute-1 ceph-mon[80107]: 11.a scrub ok
Jan 26 09:44:04 compute-1 ceph-mon[80107]: osdmap e79: 3 total, 3 up, 3 in
Jan 26 09:44:04 compute-1 ceph-mon[80107]: pgmap v69: 353 pgs: 8 active+remapped, 345 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:44:04 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Jan 26 09:44:04 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:04 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:04 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:04 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:04 compute-1 ceph-mon[80107]: Deploying daemon prometheus.compute-0 on compute-0
Jan 26 09:44:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:04.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:04 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480002140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:04 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:04.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:04 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:04 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.e scrub starts
Jan 26 09:44:05 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.e scrub ok
Jan 26 09:44:05 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e81 e81: 3 total, 3 up, 3 in
Jan 26 09:44:05 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:44:05 compute-1 ceph-mon[80107]: 11.b scrub starts
Jan 26 09:44:05 compute-1 ceph-mon[80107]: 11.b scrub ok
Jan 26 09:44:05 compute-1 ceph-mon[80107]: 8.c scrub starts
Jan 26 09:44:05 compute-1 ceph-mon[80107]: 8.c scrub ok
Jan 26 09:44:05 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Jan 26 09:44:05 compute-1 ceph-mon[80107]: osdmap e80: 3 total, 3 up, 3 in
Jan 26 09:44:05 compute-1 ceph-mon[80107]: 11.c scrub starts
Jan 26 09:44:05 compute-1 ceph-mon[80107]: 11.c scrub ok
Jan 26 09:44:05 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.6 scrub starts
Jan 26 09:44:05 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.6 scrub ok
Jan 26 09:44:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:06.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:06 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e82 e82: 3 total, 3 up, 3 in
Jan 26 09:44:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:06 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:06 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:44:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:06.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:44:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:06 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:06 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.16 scrub starts
Jan 26 09:44:06 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.16 scrub ok
Jan 26 09:44:07 compute-1 ceph-mon[80107]: 10.10 scrub starts
Jan 26 09:44:07 compute-1 ceph-mon[80107]: 10.10 scrub ok
Jan 26 09:44:07 compute-1 ceph-mon[80107]: pgmap v71: 353 pgs: 5 peering, 348 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 129 B/s, 7 objects/s recovering
Jan 26 09:44:07 compute-1 ceph-mon[80107]: 9.e scrub starts
Jan 26 09:44:07 compute-1 ceph-mon[80107]: 9.e scrub ok
Jan 26 09:44:07 compute-1 ceph-mon[80107]: osdmap e81: 3 total, 3 up, 3 in
Jan 26 09:44:07 compute-1 ceph-mon[80107]: 11.d scrub starts
Jan 26 09:44:07 compute-1 ceph-mon[80107]: 11.d scrub ok
Jan 26 09:44:07 compute-1 ceph-mon[80107]: 9.6 scrub starts
Jan 26 09:44:07 compute-1 ceph-mon[80107]: 9.6 scrub ok
Jan 26 09:44:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e83 e83: 3 total, 3 up, 3 in
Jan 26 09:44:07 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.5 scrub starts
Jan 26 09:44:07 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.5 scrub ok
Jan 26 09:44:08 compute-1 ceph-mon[80107]: 12.17 scrub starts
Jan 26 09:44:08 compute-1 ceph-mon[80107]: 12.17 scrub ok
Jan 26 09:44:08 compute-1 ceph-mon[80107]: 8.e scrub starts
Jan 26 09:44:08 compute-1 ceph-mon[80107]: 8.e scrub ok
Jan 26 09:44:08 compute-1 ceph-mon[80107]: 10.1e scrub starts
Jan 26 09:44:08 compute-1 ceph-mon[80107]: 10.1e scrub ok
Jan 26 09:44:08 compute-1 ceph-mon[80107]: pgmap v73: 353 pgs: 5 peering, 348 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 113 B/s, 6 objects/s recovering
Jan 26 09:44:08 compute-1 ceph-mon[80107]: 9.16 scrub starts
Jan 26 09:44:08 compute-1 ceph-mon[80107]: 9.16 scrub ok
Jan 26 09:44:08 compute-1 ceph-mon[80107]: osdmap e82: 3 total, 3 up, 3 in
Jan 26 09:44:08 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:08 compute-1 ceph-mon[80107]: 8.0 scrub starts
Jan 26 09:44:08 compute-1 ceph-mon[80107]: 8.0 scrub ok
Jan 26 09:44:08 compute-1 ceph-mon[80107]: 12.5 scrub starts
Jan 26 09:44:08 compute-1 ceph-mon[80107]: 12.5 scrub ok
Jan 26 09:44:08 compute-1 ceph-mon[80107]: osdmap e83: 3 total, 3 up, 3 in
Jan 26 09:44:08 compute-1 sshd-session[87025]: Connection closed by authenticating user root 178.62.249.31 port 39856 [preauth]
Jan 26 09:44:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:44:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:08.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:44:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:08 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:08 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:08.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:08 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:08 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e84 e84: 3 total, 3 up, 3 in
Jan 26 09:44:08 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.0 scrub starts
Jan 26 09:44:08 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.0 scrub ok
Jan 26 09:44:09 compute-1 ceph-mon[80107]: 11.13 deep-scrub starts
Jan 26 09:44:09 compute-1 ceph-mon[80107]: 11.13 deep-scrub ok
Jan 26 09:44:09 compute-1 ceph-mon[80107]: 11.2 scrub starts
Jan 26 09:44:09 compute-1 ceph-mon[80107]: 11.2 scrub ok
Jan 26 09:44:09 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Jan 26 09:44:09 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Jan 26 09:44:09 compute-1 ceph-mon[80107]: osdmap e84: 3 total, 3 up, 3 in
Jan 26 09:44:09 compute-1 ceph-mon[80107]: 12.0 scrub starts
Jan 26 09:44:09 compute-1 ceph-mon[80107]: 12.0 scrub ok
Jan 26 09:44:09 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.6 scrub starts
Jan 26 09:44:09 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.6 scrub ok
Jan 26 09:44:10 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e85 e85: 3 total, 3 up, 3 in
Jan 26 09:44:10 compute-1 ceph-mon[80107]: pgmap v76: 353 pgs: 2 active+remapped, 351 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:44:10 compute-1 ceph-mon[80107]: 8.1f scrub starts
Jan 26 09:44:10 compute-1 ceph-mon[80107]: 8.1f scrub ok
Jan 26 09:44:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:10.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:10 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:44:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:10 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:10 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 09:44:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:10.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 09:44:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:10 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:10 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.1f deep-scrub starts
Jan 26 09:44:10 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.1f deep-scrub ok
Jan 26 09:44:11 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e86 e86: 3 total, 3 up, 3 in
Jan 26 09:44:11 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 86 pg[9.a( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=86) [1] r=0 lpr=86 pi=[63,86)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:11 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 86 pg[9.1a( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=86) [1] r=0 lpr=86 pi=[63,86)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:11 compute-1 ceph-mon[80107]: 8.1 scrub starts
Jan 26 09:44:11 compute-1 ceph-mon[80107]: 8.1 scrub ok
Jan 26 09:44:11 compute-1 ceph-mon[80107]: 12.1d scrub starts
Jan 26 09:44:11 compute-1 ceph-mon[80107]: 12.1d scrub ok
Jan 26 09:44:11 compute-1 ceph-mon[80107]: 10.6 scrub starts
Jan 26 09:44:11 compute-1 ceph-mon[80107]: 10.6 scrub ok
Jan 26 09:44:11 compute-1 ceph-mon[80107]: osdmap e85: 3 total, 3 up, 3 in
Jan 26 09:44:11 compute-1 ceph-mon[80107]: 11.0 scrub starts
Jan 26 09:44:11 compute-1 ceph-mon[80107]: 11.0 scrub ok
Jan 26 09:44:11 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:11 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:11 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:11 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch
Jan 26 09:44:11 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Jan 26 09:44:11 compute-1 ceph-mon[80107]: 12.1f deep-scrub starts
Jan 26 09:44:11 compute-1 ceph-mon[80107]: 12.1f deep-scrub ok
Jan 26 09:44:11 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.1c scrub starts
Jan 26 09:44:11 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.1c scrub ok
Jan 26 09:44:12 compute-1 ceph-mgr[80416]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 26 09:44:12 compute-1 ceph-mgr[80416]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 26 09:44:12 compute-1 sshd-session[84082]: Connection closed by 192.168.122.100 port 49156
Jan 26 09:44:12 compute-1 sshd-session[84050]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 26 09:44:12 compute-1 systemd[1]: session-34.scope: Deactivated successfully.
Jan 26 09:44:12 compute-1 systemd[1]: session-34.scope: Consumed 17.799s CPU time.
Jan 26 09:44:12 compute-1 systemd-logind[783]: Session 34 logged out. Waiting for processes to exit.
Jan 26 09:44:12 compute-1 systemd-logind[783]: Removed session 34.
Jan 26 09:44:12 compute-1 sshd-session[87030]: Invalid user admin from 104.248.198.71 port 45068
Jan 26 09:44:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: ignoring --setuser ceph since I am not root
Jan 26 09:44:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: ignoring --setgroup ceph since I am not root
Jan 26 09:44:12 compute-1 ceph-mgr[80416]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 26 09:44:12 compute-1 ceph-mgr[80416]: pidfile_write: ignore empty --pid-file
Jan 26 09:44:12 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'alerts'
Jan 26 09:44:12 compute-1 sshd-session[87030]: Connection closed by invalid user admin 104.248.198.71 port 45068 [preauth]
Jan 26 09:44:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:12.237+0000 7f00d41c2140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 09:44:12 compute-1 ceph-mgr[80416]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 09:44:12 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'balancer'
Jan 26 09:44:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:12.314+0000 7f00d41c2140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 09:44:12 compute-1 ceph-mgr[80416]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 09:44:12 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'cephadm'
Jan 26 09:44:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e87 e87: 3 total, 3 up, 3 in
Jan 26 09:44:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 09:44:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:12.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 09:44:12 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 87 pg[9.a( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=87) [1]/[0] r=-1 lpr=87 pi=[63,87)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:12 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 87 pg[9.a( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=87) [1]/[0] r=-1 lpr=87 pi=[63,87)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 09:44:12 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 87 pg[9.1a( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=87) [1]/[0] r=-1 lpr=87 pi=[63,87)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:12 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 87 pg[9.1a( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=87) [1]/[0] r=-1 lpr=87 pi=[63,87)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 09:44:12 compute-1 ceph-mon[80107]: pgmap v79: 353 pgs: 2 active+remapped, 351 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:44:12 compute-1 ceph-mon[80107]: 11.8 scrub starts
Jan 26 09:44:12 compute-1 ceph-mon[80107]: 11.8 scrub ok
Jan 26 09:44:12 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Jan 26 09:44:12 compute-1 ceph-mon[80107]: osdmap e86: 3 total, 3 up, 3 in
Jan 26 09:44:12 compute-1 ceph-mon[80107]: 11.1f scrub starts
Jan 26 09:44:12 compute-1 ceph-mon[80107]: 11.1f scrub ok
Jan 26 09:44:12 compute-1 ceph-mon[80107]: 11.17 scrub starts
Jan 26 09:44:12 compute-1 ceph-mon[80107]: 11.17 scrub ok
Jan 26 09:44:12 compute-1 ceph-mon[80107]: 10.1c scrub starts
Jan 26 09:44:12 compute-1 ceph-mon[80107]: 10.1c scrub ok
Jan 26 09:44:12 compute-1 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished
Jan 26 09:44:12 compute-1 ceph-mon[80107]: mgrmap e27: compute-0.zllcia(active, since 95s), standbys: compute-1.xammti, compute-2.oynaeu
Jan 26 09:44:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:12 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003c10 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:12 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003c10 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 09:44:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:12.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 09:44:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:12 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:12 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.1d scrub starts
Jan 26 09:44:12 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.1d scrub ok
Jan 26 09:44:13 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'crash'
Jan 26 09:44:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:13.201+0000 7f00d41c2140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 09:44:13 compute-1 ceph-mgr[80416]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 09:44:13 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'dashboard'
Jan 26 09:44:13 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e88 e88: 3 total, 3 up, 3 in
Jan 26 09:44:13 compute-1 ceph-mon[80107]: osdmap e87: 3 total, 3 up, 3 in
Jan 26 09:44:13 compute-1 ceph-mon[80107]: 8.1d scrub starts
Jan 26 09:44:13 compute-1 ceph-mon[80107]: 8.1d scrub ok
Jan 26 09:44:13 compute-1 ceph-mon[80107]: 8.5 scrub starts
Jan 26 09:44:13 compute-1 ceph-mon[80107]: 8.5 scrub ok
Jan 26 09:44:13 compute-1 ceph-mon[80107]: 10.1d scrub starts
Jan 26 09:44:13 compute-1 ceph-mon[80107]: 10.1d scrub ok
Jan 26 09:44:13 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'devicehealth'
Jan 26 09:44:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:13.844+0000 7f00d41c2140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 09:44:13 compute-1 ceph-mgr[80416]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 09:44:13 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'diskprediction_local'
Jan 26 09:44:13 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.1b scrub starts
Jan 26 09:44:13 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.1b scrub ok
Jan 26 09:44:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 26 09:44:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 26 09:44:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]:   from numpy import show_config as show_numpy_config
Jan 26 09:44:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:14.018+0000 7f00d41c2140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 09:44:14 compute-1 ceph-mgr[80416]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 09:44:14 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'influx'
Jan 26 09:44:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:14.085+0000 7f00d41c2140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 09:44:14 compute-1 ceph-mgr[80416]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 09:44:14 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'insights'
Jan 26 09:44:14 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'iostat'
Jan 26 09:44:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:14.229+0000 7f00d41c2140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 09:44:14 compute-1 ceph-mgr[80416]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 09:44:14 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'k8sevents'
Jan 26 09:44:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 09:44:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:14.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 09:44:14 compute-1 ceph-mon[80107]: osdmap e88: 3 total, 3 up, 3 in
Jan 26 09:44:14 compute-1 ceph-mon[80107]: 12.1e scrub starts
Jan 26 09:44:14 compute-1 ceph-mon[80107]: 12.1e scrub ok
Jan 26 09:44:14 compute-1 ceph-mon[80107]: 12.1b scrub starts
Jan 26 09:44:14 compute-1 ceph-mon[80107]: 12.1b scrub ok
Jan 26 09:44:14 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e89 e89: 3 total, 3 up, 3 in
Jan 26 09:44:14 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 89 pg[9.a( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=9 ec=63/48 lis/c=87/63 les/c/f=88/65/0 sis=89) [1] r=0 lpr=89 pi=[63,89)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:14 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 89 pg[9.a( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=9 ec=63/48 lis/c=87/63 les/c/f=88/65/0 sis=89) [1] r=0 lpr=89 pi=[63,89)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:14 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 89 pg[9.1a( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=87/63 les/c/f=88/65/0 sis=89) [1] r=0 lpr=89 pi=[63,89)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:14 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 89 pg[9.1a( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=87/63 les/c/f=88/65/0 sis=89) [1] r=0 lpr=89 pi=[63,89)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:14 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'localpool'
Jan 26 09:44:14 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'mds_autoscaler'
Jan 26 09:44:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:14 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480003490 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:14 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003c10 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:14.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:14 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003c10 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:14 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'mirroring'
Jan 26 09:44:14 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.16 scrub starts
Jan 26 09:44:14 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.16 scrub ok
Jan 26 09:44:15 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'nfs'
Jan 26 09:44:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:15.281+0000 7f00d41c2140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 09:44:15 compute-1 ceph-mgr[80416]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 09:44:15 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'orchestrator'
Jan 26 09:44:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:15.511+0000 7f00d41c2140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 09:44:15 compute-1 ceph-mgr[80416]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 09:44:15 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'osd_perf_query'
Jan 26 09:44:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:15.589+0000 7f00d41c2140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 09:44:15 compute-1 ceph-mgr[80416]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 09:44:15 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'osd_support'
Jan 26 09:44:15 compute-1 ceph-mon[80107]: osdmap e89: 3 total, 3 up, 3 in
Jan 26 09:44:15 compute-1 ceph-mon[80107]: 12.2 scrub starts
Jan 26 09:44:15 compute-1 ceph-mon[80107]: 12.2 scrub ok
Jan 26 09:44:15 compute-1 ceph-mon[80107]: 12.16 scrub starts
Jan 26 09:44:15 compute-1 ceph-mon[80107]: 12.16 scrub ok
Jan 26 09:44:15 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:44:15 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e90 e90: 3 total, 3 up, 3 in
Jan 26 09:44:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:15.661+0000 7f00d41c2140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 09:44:15 compute-1 ceph-mgr[80416]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 09:44:15 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'pg_autoscaler'
Jan 26 09:44:15 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 90 pg[9.1a( v 60'1159 (0'0,60'1159] local-lis/les=89/90 n=5 ec=63/48 lis/c=87/63 les/c/f=88/65/0 sis=89) [1] r=0 lpr=89 pi=[63,89)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:44:15 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 90 pg[9.a( v 60'1159 (0'0,60'1159] local-lis/les=89/90 n=9 ec=63/48 lis/c=87/63 les/c/f=88/65/0 sis=89) [1] r=0 lpr=89 pi=[63,89)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:44:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:15.756+0000 7f00d41c2140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 09:44:15 compute-1 ceph-mgr[80416]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 09:44:15 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'progress'
Jan 26 09:44:15 compute-1 sshd-session[87064]: Connection closed by authenticating user root 152.42.136.110 port 36108 [preauth]
Jan 26 09:44:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:15.837+0000 7f00d41c2140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 09:44:15 compute-1 ceph-mgr[80416]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 09:44:15 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'prometheus'
Jan 26 09:44:15 compute-1 sshd-session[87067]: Accepted publickey for zuul from 192.168.122.30 port 42858 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:44:15 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.14 scrub starts
Jan 26 09:44:15 compute-1 systemd-logind[783]: New session 36 of user zuul.
Jan 26 09:44:15 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.14 scrub ok
Jan 26 09:44:15 compute-1 systemd[1]: Started Session 36 of User zuul.
Jan 26 09:44:15 compute-1 sshd-session[87067]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:44:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:16.207+0000 7f00d41c2140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 09:44:16 compute-1 ceph-mgr[80416]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 09:44:16 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'rbd_support'
Jan 26 09:44:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:16.311+0000 7f00d41c2140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 09:44:16 compute-1 ceph-mgr[80416]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 09:44:16 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'restful'
Jan 26 09:44:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:16.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:16 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'rgw'
Jan 26 09:44:16 compute-1 ceph-mon[80107]: osdmap e90: 3 total, 3 up, 3 in
Jan 26 09:44:16 compute-1 ceph-mon[80107]: 11.19 scrub starts
Jan 26 09:44:16 compute-1 ceph-mon[80107]: 11.19 scrub ok
Jan 26 09:44:16 compute-1 ceph-mon[80107]: 12.14 scrub starts
Jan 26 09:44:16 compute-1 ceph-mon[80107]: 12.14 scrub ok
Jan 26 09:44:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:16 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:16 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480003490 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:16.761+0000 7f00d41c2140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 09:44:16 compute-1 ceph-mgr[80416]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 09:44:16 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'rook'
Jan 26 09:44:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:44:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:16.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:44:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:16 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003c10 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:16 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.1 scrub starts
Jan 26 09:44:16 compute-1 python3.9[87220]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:44:16 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.1 scrub ok
Jan 26 09:44:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:17.368+0000 7f00d41c2140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 09:44:17 compute-1 ceph-mgr[80416]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 09:44:17 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'selftest'
Jan 26 09:44:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:17.442+0000 7f00d41c2140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 09:44:17 compute-1 ceph-mgr[80416]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 09:44:17 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'snap_schedule'
Jan 26 09:44:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:17.529+0000 7f00d41c2140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 09:44:17 compute-1 ceph-mgr[80416]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 09:44:17 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'stats'
Jan 26 09:44:17 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'status'
Jan 26 09:44:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:17.711+0000 7f00d41c2140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 09:44:17 compute-1 ceph-mgr[80416]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 09:44:17 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'telegraf'
Jan 26 09:44:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:17.796+0000 7f00d41c2140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 09:44:17 compute-1 ceph-mgr[80416]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 09:44:17 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'telemetry'
Jan 26 09:44:17 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1a scrub starts
Jan 26 09:44:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:17.964+0000 7f00d41c2140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 09:44:17 compute-1 ceph-mgr[80416]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 09:44:17 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'test_orchestrator'
Jan 26 09:44:17 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1a scrub ok
Jan 26 09:44:18 compute-1 ceph-mon[80107]: 10.f scrub starts
Jan 26 09:44:18 compute-1 ceph-mon[80107]: 10.f scrub ok
Jan 26 09:44:18 compute-1 ceph-mon[80107]: 12.1 scrub starts
Jan 26 09:44:18 compute-1 ceph-mon[80107]: 12.1 scrub ok
Jan 26 09:44:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:18.203+0000 7f00d41c2140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 09:44:18 compute-1 ceph-mgr[80416]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 09:44:18 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'volumes'
Jan 26 09:44:18 compute-1 sudo[87433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydxaepbvezeraselnpldhtosilhzmuux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420658.0287023-53-30109697104993/AnsiballZ_command.py'
Jan 26 09:44:18 compute-1 sudo[87433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:44:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 09:44:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:18.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 09:44:18 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e91 e91: 3 total, 3 up, 3 in
Jan 26 09:44:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:18.518+0000 7f00d41c2140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 09:44:18 compute-1 ceph-mgr[80416]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 09:44:18 compute-1 ceph-mgr[80416]: mgr[py] Loading python module 'zabbix'
Jan 26 09:44:18 compute-1 python3.9[87435]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:44:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:18.611+0000 7f00d41c2140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 09:44:18 compute-1 ceph-mgr[80416]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 09:44:18 compute-1 ceph-mgr[80416]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 26 09:44:18 compute-1 ceph-mgr[80416]: mgr load Constructed class from module: dashboard
Jan 26 09:44:18 compute-1 ceph-mgr[80416]: [prometheus DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 26 09:44:18 compute-1 ceph-mgr[80416]: mgr load Constructed class from module: prometheus
Jan 26 09:44:18 compute-1 ceph-mgr[80416]: ms_deliver_dispatch: unhandled message 0x55eef89e7860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Jan 26 09:44:18 compute-1 ceph-mgr[80416]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Jan 26 09:44:18 compute-1 ceph-mgr[80416]: [prometheus INFO root] server_addr: :: server_port: 9283
Jan 26 09:44:18 compute-1 ceph-mgr[80416]: [prometheus INFO root] Starting engine...
Jan 26 09:44:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: [26/Jan/2026:09:44:18] ENGINE Bus STARTING
Jan 26 09:44:18 compute-1 ceph-mgr[80416]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 26 09:44:18 compute-1 ceph-mgr[80416]: [prometheus INFO cherrypy.error] [26/Jan/2026:09:44:18] ENGINE Bus STARTING
Jan 26 09:44:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: CherryPy Checker:
Jan 26 09:44:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: The Application mounted at '' has an empty config.
Jan 26 09:44:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 
Jan 26 09:44:18 compute-1 ceph-mgr[80416]: [dashboard INFO root] Starting engine...
Jan 26 09:44:18 compute-1 ceph-mgr[80416]: [dashboard INFO root] Engine started...
Jan 26 09:44:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: [26/Jan/2026:09:44:18] ENGINE Serving on http://:::9283
Jan 26 09:44:18 compute-1 ceph-mgr[80416]: [prometheus INFO cherrypy.error] [26/Jan/2026:09:44:18] ENGINE Serving on http://:::9283
Jan 26 09:44:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: [26/Jan/2026:09:44:18] ENGINE Bus STARTED
Jan 26 09:44:18 compute-1 ceph-mgr[80416]: [prometheus INFO cherrypy.error] [26/Jan/2026:09:44:18] ENGINE Bus STARTED
Jan 26 09:44:18 compute-1 ceph-mgr[80416]: [prometheus INFO root] Engine started.
Jan 26 09:44:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:18 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:18 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:18.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:18 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:19 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1e deep-scrub starts
Jan 26 09:44:19 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1e deep-scrub ok
Jan 26 09:44:19 compute-1 ceph-mon[80107]: 12.3 scrub starts
Jan 26 09:44:19 compute-1 ceph-mon[80107]: 12.3 scrub ok
Jan 26 09:44:19 compute-1 ceph-mon[80107]: 11.1a scrub starts
Jan 26 09:44:19 compute-1 ceph-mon[80107]: 11.1a scrub ok
Jan 26 09:44:19 compute-1 ceph-mon[80107]: Active manager daemon compute-0.zllcia restarted
Jan 26 09:44:19 compute-1 ceph-mon[80107]: Activating manager daemon compute-0.zllcia
Jan 26 09:44:19 compute-1 ceph-mon[80107]: osdmap e91: 3 total, 3 up, 3 in
Jan 26 09:44:19 compute-1 ceph-mon[80107]: mgrmap e28: compute-0.zllcia(active, starting, since 0.0459661s), standbys: compute-1.xammti, compute-2.oynaeu
Jan 26 09:44:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Jan 26 09:44:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 26 09:44:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 26 09:44:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.zhqpiu"}]: dispatch
Jan 26 09:44:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-1.rbkelk"}]: dispatch
Jan 26 09:44:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-2.zprrum"}]: dispatch
Jan 26 09:44:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mgr metadata", "who": "compute-0.zllcia", "id": "compute-0.zllcia"}]: dispatch
Jan 26 09:44:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mgr metadata", "who": "compute-1.xammti", "id": "compute-1.xammti"}]: dispatch
Jan 26 09:44:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mgr metadata", "who": "compute-2.oynaeu", "id": "compute-2.oynaeu"}]: dispatch
Jan 26 09:44:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Jan 26 09:44:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Jan 26 09:44:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 26 09:44:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mds metadata"}]: dispatch
Jan 26 09:44:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 26 09:44:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mon metadata"}]: dispatch
Jan 26 09:44:19 compute-1 ceph-mon[80107]: Manager daemon compute-0.zllcia is now available
Jan 26 09:44:19 compute-1 ceph-mon[80107]: Standby manager daemon compute-1.xammti restarted
Jan 26 09:44:19 compute-1 ceph-mon[80107]: Standby manager daemon compute-1.xammti started
Jan 26 09:44:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:44:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.zllcia/mirror_snapshot_schedule"}]: dispatch
Jan 26 09:44:19 compute-1 ceph-mon[80107]: Standby manager daemon compute-2.oynaeu restarted
Jan 26 09:44:19 compute-1 ceph-mon[80107]: Standby manager daemon compute-2.oynaeu started
Jan 26 09:44:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.zllcia/trash_purge_schedule"}]: dispatch
Jan 26 09:44:19 compute-1 ceph-mon[80107]: 11.1e deep-scrub starts
Jan 26 09:44:19 compute-1 sshd-session[87468]: Accepted publickey for ceph-admin from 192.168.122.100 port 45562 ssh2: RSA SHA256:cGz1g5qmzBfeiAiDRElnaAonZh1cdMIZMAXyGkEzbws
Jan 26 09:44:19 compute-1 systemd-logind[783]: New session 37 of user ceph-admin.
Jan 26 09:44:19 compute-1 systemd[1]: Started Session 37 of User ceph-admin.
Jan 26 09:44:19 compute-1 sshd-session[87468]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 26 09:44:19 compute-1 sudo[87475]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:44:19 compute-1 sudo[87475]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:19 compute-1 sudo[87475]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:19 compute-1 sudo[87500]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 26 09:44:19 compute-1 sudo[87500]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:19 compute-1 podman[87599]: 2026-01-26 09:44:19.897302696 +0000 UTC m=+0.054763931 container exec 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 09:44:19 compute-1 podman[87599]: 2026-01-26 09:44:19.98710402 +0000 UTC m=+0.144565255 container exec_died 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 09:44:20 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1c scrub starts
Jan 26 09:44:20 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1c scrub ok
Jan 26 09:44:20 compute-1 ceph-mon[80107]: 12.9 scrub starts
Jan 26 09:44:20 compute-1 ceph-mon[80107]: 12.9 scrub ok
Jan 26 09:44:20 compute-1 ceph-mon[80107]: 11.1e deep-scrub ok
Jan 26 09:44:20 compute-1 ceph-mon[80107]: 8.1e scrub starts
Jan 26 09:44:20 compute-1 ceph-mon[80107]: 8.1e scrub ok
Jan 26 09:44:20 compute-1 ceph-mon[80107]: mgrmap e29: compute-0.zllcia(active, since 1.07102s), standbys: compute-1.xammti, compute-2.oynaeu
Jan 26 09:44:20 compute-1 podman[87733]: 2026-01-26 09:44:20.494912924 +0000 UTC m=+0.068003165 container exec 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 09:44:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 09:44:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:20.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 09:44:20 compute-1 podman[87733]: 2026-01-26 09:44:20.532123655 +0000 UTC m=+0.105213886 container exec_died 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 09:44:20 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:44:20 compute-1 podman[87806]: 2026-01-26 09:44:20.736082599 +0000 UTC m=+0.045009035 container exec 19c026de89e744ddc79168cf6d35f36da879cfc6e36f4542a44b0a0f6b664c36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1)
Jan 26 09:44:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:20 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:20 compute-1 podman[87806]: 2026-01-26 09:44:20.750011273 +0000 UTC m=+0.058937689 container exec_died 19c026de89e744ddc79168cf6d35f36da879cfc6e36f4542a44b0a0f6b664c36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Jan 26 09:44:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:20 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:20.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:20 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4001320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:20 compute-1 podman[87873]: 2026-01-26 09:44:20.944644604 +0000 UTC m=+0.057217025 container exec 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 09:44:20 compute-1 podman[87873]: 2026-01-26 09:44:20.954058409 +0000 UTC m=+0.066630800 container exec_died 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 09:44:20 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.1b scrub starts
Jan 26 09:44:21 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.1b scrub ok
Jan 26 09:44:21 compute-1 ceph-mon[80107]: 8.d scrub starts
Jan 26 09:44:21 compute-1 ceph-mon[80107]: 8.d scrub ok
Jan 26 09:44:21 compute-1 ceph-mon[80107]: [26/Jan/2026:09:44:19] ENGINE Bus STARTING
Jan 26 09:44:21 compute-1 ceph-mon[80107]: [26/Jan/2026:09:44:19] ENGINE Serving on http://192.168.122.100:8765
Jan 26 09:44:21 compute-1 ceph-mon[80107]: 11.1c scrub starts
Jan 26 09:44:21 compute-1 ceph-mon[80107]: 11.1c scrub ok
Jan 26 09:44:21 compute-1 ceph-mon[80107]: [26/Jan/2026:09:44:20] ENGINE Serving on https://192.168.122.100:7150
Jan 26 09:44:21 compute-1 ceph-mon[80107]: [26/Jan/2026:09:44:20] ENGINE Bus STARTED
Jan 26 09:44:21 compute-1 ceph-mon[80107]: [26/Jan/2026:09:44:20] ENGINE Client ('192.168.122.100', 36132) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 26 09:44:21 compute-1 ceph-mon[80107]: 8.1a scrub starts
Jan 26 09:44:21 compute-1 ceph-mon[80107]: 8.1a scrub ok
Jan 26 09:44:21 compute-1 ceph-mon[80107]: pgmap v4: 353 pgs: 353 active+clean; 457 KiB data, 143 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:44:21 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 26 09:44:21 compute-1 ceph-mon[80107]: 8.1b scrub starts
Jan 26 09:44:21 compute-1 ceph-mon[80107]: 8.1b scrub ok
Jan 26 09:44:21 compute-1 podman[87939]: 2026-01-26 09:44:21.14526337 +0000 UTC m=+0.053084067 container exec 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, distribution-scope=public, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, vcs-type=git, io.buildah.version=1.28.2)
Jan 26 09:44:21 compute-1 podman[87939]: 2026-01-26 09:44:21.170309463 +0000 UTC m=+0.078130140 container exec_died 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, release=1793, architecture=x86_64, vcs-type=git, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4)
Jan 26 09:44:21 compute-1 sudo[87500]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:21 compute-1 sudo[87970]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:44:21 compute-1 sudo[87970]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:21 compute-1 sudo[87970]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:21 compute-1 sudo[87995]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 09:44:21 compute-1 sudo[87995]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:21 compute-1 sudo[87995]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:21 compute-1 sudo[88055]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:44:21 compute-1 sudo[88055]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:21 compute-1 sudo[88055]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:21 compute-1 sudo[88081]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Jan 26 09:44:21 compute-1 sudo[88081]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:21 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.7 scrub starts
Jan 26 09:44:21 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.7 scrub ok
Jan 26 09:44:22 compute-1 sudo[88081]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:44:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:22.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:44:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:22 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:22 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:44:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:22.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:44:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:22 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:22 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.4 scrub starts
Jan 26 09:44:22 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.4 scrub ok
Jan 26 09:44:23 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e92 e92: 3 total, 3 up, 3 in
Jan 26 09:44:23 compute-1 ceph-mon[80107]: 8.3 deep-scrub starts
Jan 26 09:44:23 compute-1 ceph-mon[80107]: 8.3 deep-scrub ok
Jan 26 09:44:23 compute-1 ceph-mon[80107]: mgrmap e30: compute-0.zllcia(active, since 2s), standbys: compute-1.xammti, compute-2.oynaeu
Jan 26 09:44:23 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:23 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:23 compute-1 ceph-mon[80107]: 11.18 scrub starts
Jan 26 09:44:23 compute-1 ceph-mon[80107]: 11.18 scrub ok
Jan 26 09:44:23 compute-1 ceph-mon[80107]: 11.7 scrub starts
Jan 26 09:44:23 compute-1 ceph-mon[80107]: 11.7 scrub ok
Jan 26 09:44:23 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1b scrub starts
Jan 26 09:44:23 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1b scrub ok
Jan 26 09:44:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:24.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:24 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a40092a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:24 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:24.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:24 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:24 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.4 scrub starts
Jan 26 09:44:25 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.4 scrub ok
Jan 26 09:44:25 compute-1 sudo[87433]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:25 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:44:25 compute-1 ceph-mon[80107]: 10.1 scrub starts
Jan 26 09:44:25 compute-1 ceph-mon[80107]: 10.1 scrub ok
Jan 26 09:44:25 compute-1 ceph-mon[80107]: pgmap v5: 353 pgs: 353 active+clean; 457 KiB data, 143 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:44:25 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 26 09:44:25 compute-1 ceph-mon[80107]: 10.5 scrub starts
Jan 26 09:44:25 compute-1 ceph-mon[80107]: 10.5 scrub ok
Jan 26 09:44:25 compute-1 ceph-mon[80107]: 8.2 scrub starts
Jan 26 09:44:25 compute-1 ceph-mon[80107]: 8.2 scrub ok
Jan 26 09:44:25 compute-1 ceph-mon[80107]: 11.4 scrub starts
Jan 26 09:44:25 compute-1 ceph-mon[80107]: 11.4 scrub ok
Jan 26 09:44:25 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Jan 26 09:44:25 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:25 compute-1 ceph-mon[80107]: osdmap e92: 3 total, 3 up, 3 in
Jan 26 09:44:25 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:25 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:25 compute-1 ceph-mon[80107]: mgrmap e31: compute-0.zllcia(active, since 5s), standbys: compute-1.xammti, compute-2.oynaeu
Jan 26 09:44:25 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:25 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 26 09:44:25 compute-1 ceph-mon[80107]: 10.13 scrub starts
Jan 26 09:44:25 compute-1 ceph-mon[80107]: 10.13 scrub ok
Jan 26 09:44:25 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:25 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:25 compute-1 ceph-mon[80107]: 11.1b scrub starts
Jan 26 09:44:25 compute-1 ceph-mon[80107]: 11.1b scrub ok
Jan 26 09:44:25 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e93 e93: 3 total, 3 up, 3 in
Jan 26 09:44:25 compute-1 sudo[88166]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 26 09:44:25 compute-1 sudo[88166]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:25 compute-1 sudo[88166]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:25 compute-1 sudo[88191]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph
Jan 26 09:44:25 compute-1 sudo[88191]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:25 compute-1 sudo[88191]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:25 compute-1 sudo[88216]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new
Jan 26 09:44:25 compute-1 sudo[88216]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:25 compute-1 sudo[88216]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:25 compute-1 sudo[88241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:44:25 compute-1 sudo[88241]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:25 compute-1 sudo[88241]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:25 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.5 scrub starts
Jan 26 09:44:25 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.5 scrub ok
Jan 26 09:44:26 compute-1 sudo[88266]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new
Jan 26 09:44:26 compute-1 sudo[88266]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:26 compute-1 sudo[88266]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:26 compute-1 sudo[88314]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new
Jan 26 09:44:26 compute-1 sudo[88314]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:26 compute-1 sudo[88314]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:26 compute-1 sudo[88339]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new
Jan 26 09:44:26 compute-1 sudo[88339]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:26 compute-1 sudo[88339]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:26 compute-1 sudo[88364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Jan 26 09:44:26 compute-1 sudo[88364]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:26 compute-1 sudo[88364]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:26 compute-1 sudo[88389]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config
Jan 26 09:44:26 compute-1 sudo[88389]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:26 compute-1 sudo[88389]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:26 compute-1 sudo[88414]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config
Jan 26 09:44:26 compute-1 sudo[88414]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:26 compute-1 sudo[88414]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:26 compute-1 sudo[88439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new
Jan 26 09:44:26 compute-1 sudo[88439]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:26 compute-1 sudo[88439]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:26 compute-1 sudo[88464]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:44:26 compute-1 sudo[88464]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:26 compute-1 sudo[88464]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:44:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:26.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:44:26 compute-1 sudo[88489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new
Jan 26 09:44:26 compute-1 sudo[88489]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:26 compute-1 sudo[88489]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:26 compute-1 sudo[88537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new
Jan 26 09:44:26 compute-1 sudo[88537]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:26 compute-1 sudo[88537]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:26 compute-1 sudo[88562]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new
Jan 26 09:44:26 compute-1 sudo[88562]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:26 compute-1 sudo[88562]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:26 compute-1 sudo[88587]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf.new /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 09:44:26 compute-1 sudo[88587]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:26 compute-1 sudo[88587]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:26 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:26 compute-1 sudo[88612]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 26 09:44:26 compute-1 sudo[88612]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:26 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a40092a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:26 compute-1 sudo[88612]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:26.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:26 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a40092a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:26 compute-1 sudo[88637]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph
Jan 26 09:44:26 compute-1 sudo[88637]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:26 compute-1 sudo[88637]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:26 compute-1 sudo[88662]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.client.admin.keyring.new
Jan 26 09:44:26 compute-1 sudo[88662]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:26 compute-1 sudo[88662]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:26 compute-1 sudo[88687]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:44:26 compute-1 sudo[88687]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:26 compute-1 sudo[88687]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:26 compute-1 ceph-mon[80107]: 11.3 scrub starts
Jan 26 09:44:26 compute-1 ceph-mon[80107]: 11.3 scrub ok
Jan 26 09:44:26 compute-1 ceph-mon[80107]: pgmap v7: 353 pgs: 353 active+clean; 457 KiB data, 143 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:44:26 compute-1 ceph-mon[80107]: 12.12 scrub starts
Jan 26 09:44:26 compute-1 ceph-mon[80107]: 12.4 scrub starts
Jan 26 09:44:26 compute-1 ceph-mon[80107]: 12.12 scrub ok
Jan 26 09:44:26 compute-1 ceph-mon[80107]: 12.4 scrub ok
Jan 26 09:44:26 compute-1 ceph-mon[80107]: 8.4 scrub starts
Jan 26 09:44:26 compute-1 ceph-mon[80107]: 8.4 scrub ok
Jan 26 09:44:26 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 26 09:44:26 compute-1 ceph-mon[80107]: 12.e scrub starts
Jan 26 09:44:26 compute-1 ceph-mon[80107]: 8.9 scrub starts
Jan 26 09:44:26 compute-1 ceph-mon[80107]: 8.9 scrub ok
Jan 26 09:44:26 compute-1 ceph-mon[80107]: 12.e scrub ok
Jan 26 09:44:26 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Jan 26 09:44:26 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:26 compute-1 ceph-mon[80107]: osdmap e93: 3 total, 3 up, 3 in
Jan 26 09:44:26 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:26 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:26 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 26 09:44:26 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:26 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 26 09:44:26 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:44:26 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 09:44:26 compute-1 ceph-mon[80107]: Updating compute-0:/etc/ceph/ceph.conf
Jan 26 09:44:26 compute-1 ceph-mon[80107]: Updating compute-1:/etc/ceph/ceph.conf
Jan 26 09:44:26 compute-1 ceph-mon[80107]: Updating compute-2:/etc/ceph/ceph.conf
Jan 26 09:44:26 compute-1 ceph-mon[80107]: 11.5 scrub starts
Jan 26 09:44:26 compute-1 ceph-mon[80107]: 11.5 scrub ok
Jan 26 09:44:26 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 26 09:44:26 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e94 e94: 3 total, 3 up, 3 in
Jan 26 09:44:27 compute-1 sudo[88712]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.client.admin.keyring.new
Jan 26 09:44:27 compute-1 sudo[88712]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:27 compute-1 sudo[88712]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:27 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1 scrub starts
Jan 26 09:44:27 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1 scrub ok
Jan 26 09:44:27 compute-1 sudo[88760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.client.admin.keyring.new
Jan 26 09:44:27 compute-1 sudo[88760]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:27 compute-1 sudo[88760]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:27 compute-1 sudo[88785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.client.admin.keyring.new
Jan 26 09:44:27 compute-1 sudo[88785]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:27 compute-1 sudo[88785]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:27 compute-1 sudo[88810]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Jan 26 09:44:27 compute-1 sudo[88810]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:27 compute-1 sudo[88810]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:27 compute-1 sudo[88835]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config
Jan 26 09:44:27 compute-1 sudo[88835]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:27 compute-1 sudo[88835]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:27 compute-1 sudo[88860]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config
Jan 26 09:44:27 compute-1 sudo[88860]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:27 compute-1 sudo[88860]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:27 compute-1 sudo[88885]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring.new
Jan 26 09:44:27 compute-1 sudo[88885]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:27 compute-1 sudo[88885]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:27 compute-1 sudo[88910]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:44:27 compute-1 sudo[88910]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:27 compute-1 sudo[88910]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:27 compute-1 sudo[88935]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring.new
Jan 26 09:44:27 compute-1 sudo[88935]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:27 compute-1 sudo[88935]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:27 compute-1 sudo[88984]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring.new
Jan 26 09:44:27 compute-1 sudo[88984]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:27 compute-1 sudo[88984]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:27 compute-1 sudo[89009]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring.new
Jan 26 09:44:27 compute-1 sudo[89009]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:27 compute-1 sudo[89009]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:27 compute-1 sudo[89034]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1a70b85d-e3fd-5814-8a6a-37ea00fcae30/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring.new /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 09:44:27 compute-1 sudo[89034]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:27 compute-1 sudo[89034]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:27 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.f scrub starts
Jan 26 09:44:28 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.f scrub ok
Jan 26 09:44:28 compute-1 ceph-mon[80107]: Updating compute-1:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 09:44:28 compute-1 ceph-mon[80107]: Updating compute-0:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 09:44:28 compute-1 ceph-mon[80107]: Updating compute-2:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 09:44:28 compute-1 ceph-mon[80107]: pgmap v9: 353 pgs: 353 active+clean; 457 KiB data, 143 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 0 B/s wr, 8 op/s
Jan 26 09:44:28 compute-1 ceph-mon[80107]: 12.b scrub starts
Jan 26 09:44:28 compute-1 ceph-mon[80107]: 12.b scrub ok
Jan 26 09:44:28 compute-1 ceph-mon[80107]: 8.1c scrub starts
Jan 26 09:44:28 compute-1 ceph-mon[80107]: 8.1c scrub ok
Jan 26 09:44:28 compute-1 ceph-mon[80107]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 26 09:44:28 compute-1 ceph-mon[80107]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 26 09:44:28 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Jan 26 09:44:28 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Jan 26 09:44:28 compute-1 ceph-mon[80107]: osdmap e94: 3 total, 3 up, 3 in
Jan 26 09:44:28 compute-1 ceph-mon[80107]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 26 09:44:28 compute-1 ceph-mon[80107]: 12.c deep-scrub starts
Jan 26 09:44:28 compute-1 sshd-session[87070]: Connection closed by 192.168.122.30 port 42858
Jan 26 09:44:28 compute-1 sshd-session[87067]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:44:28 compute-1 systemd[1]: session-36.scope: Deactivated successfully.
Jan 26 09:44:28 compute-1 systemd[1]: session-36.scope: Consumed 8.048s CPU time.
Jan 26 09:44:28 compute-1 systemd-logind[783]: Session 36 logged out. Waiting for processes to exit.
Jan 26 09:44:28 compute-1 systemd-logind[783]: Removed session 36.
Jan 26 09:44:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:44:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:28.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:44:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:28 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:28 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:44:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:28.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:44:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:28 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:28 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.8 deep-scrub starts
Jan 26 09:44:28 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.8 deep-scrub ok
Jan 26 09:44:29 compute-1 ceph-mon[80107]: 11.1 scrub starts
Jan 26 09:44:29 compute-1 ceph-mon[80107]: 11.1 scrub ok
Jan 26 09:44:29 compute-1 ceph-mon[80107]: Updating compute-1:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 09:44:29 compute-1 ceph-mon[80107]: Updating compute-0:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 09:44:29 compute-1 ceph-mon[80107]: 10.4 scrub starts
Jan 26 09:44:29 compute-1 ceph-mon[80107]: 10.4 scrub ok
Jan 26 09:44:29 compute-1 ceph-mon[80107]: Updating compute-2:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 09:44:29 compute-1 ceph-mon[80107]: 11.f scrub starts
Jan 26 09:44:29 compute-1 ceph-mon[80107]: 11.f scrub ok
Jan 26 09:44:29 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:29 compute-1 ceph-mon[80107]: 12.c deep-scrub ok
Jan 26 09:44:29 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:29 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:29 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:29 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:29 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 26 09:44:29 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:29 compute-1 ceph-mon[80107]: 12.a scrub starts
Jan 26 09:44:29 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:29 compute-1 ceph-mon[80107]: 12.a scrub ok
Jan 26 09:44:29 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:29 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 09:44:29 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 09:44:29 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:44:29 compute-1 ceph-mon[80107]: 8.8 deep-scrub starts
Jan 26 09:44:29 compute-1 ceph-mon[80107]: 8.8 deep-scrub ok
Jan 26 09:44:29 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e95 e95: 3 total, 3 up, 3 in
Jan 26 09:44:29 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 95 pg[9.d( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=95) [1] r=0 lpr=95 pi=[79,95)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:29 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 95 pg[9.1d( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=78/78 les/c/f=79/79/0 sis=95) [1] r=0 lpr=95 pi=[78,95)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:29 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.17 scrub starts
Jan 26 09:44:30 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.17 scrub ok
Jan 26 09:44:30 compute-1 ceph-mon[80107]: pgmap v11: 353 pgs: 353 active+clean; 457 KiB data, 143 MiB used, 60 GiB / 60 GiB avail; 28 KiB/s rd, 0 B/s wr, 10 op/s
Jan 26 09:44:30 compute-1 ceph-mon[80107]: 8.15 scrub starts
Jan 26 09:44:30 compute-1 ceph-mon[80107]: 8.15 scrub ok
Jan 26 09:44:30 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 26 09:44:30 compute-1 ceph-mon[80107]: osdmap e95: 3 total, 3 up, 3 in
Jan 26 09:44:30 compute-1 ceph-mon[80107]: 12.6 scrub starts
Jan 26 09:44:30 compute-1 ceph-mon[80107]: 12.6 scrub ok
Jan 26 09:44:30 compute-1 ceph-mon[80107]: 8.17 scrub starts
Jan 26 09:44:30 compute-1 ceph-mon[80107]: 8.17 scrub ok
Jan 26 09:44:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:30.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:30 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e96 e96: 3 total, 3 up, 3 in
Jan 26 09:44:30 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 96 pg[9.d( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=96) [1]/[2] r=-1 lpr=96 pi=[79,96)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:30 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 96 pg[9.d( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=96) [1]/[2] r=-1 lpr=96 pi=[79,96)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 09:44:30 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 96 pg[9.1d( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=78/78 les/c/f=79/79/0 sis=96) [1]/[2] r=-1 lpr=96 pi=[78,96)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:30 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 96 pg[9.1d( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=78/78 les/c/f=79/79/0 sis=96) [1]/[2] r=-1 lpr=96 pi=[78,96)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 09:44:30 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:44:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:30 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488002550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:30 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:30.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:30 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:30 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.14 scrub starts
Jan 26 09:44:30 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.14 scrub ok
Jan 26 09:44:31 compute-1 ceph-mon[80107]: 12.13 scrub starts
Jan 26 09:44:31 compute-1 ceph-mon[80107]: 12.13 scrub ok
Jan 26 09:44:31 compute-1 ceph-mon[80107]: pgmap v13: 353 pgs: 353 active+clean; 457 KiB data, 143 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 5 op/s
Jan 26 09:44:31 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Jan 26 09:44:31 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Jan 26 09:44:31 compute-1 ceph-mon[80107]: osdmap e96: 3 total, 3 up, 3 in
Jan 26 09:44:31 compute-1 ceph-mon[80107]: 12.1c scrub starts
Jan 26 09:44:31 compute-1 ceph-mon[80107]: 12.1c scrub ok
Jan 26 09:44:31 compute-1 ceph-mon[80107]: 8.14 scrub starts
Jan 26 09:44:31 compute-1 ceph-mon[80107]: 8.14 scrub ok
Jan 26 09:44:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094431 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:44:31 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e97 e97: 3 total, 3 up, 3 in
Jan 26 09:44:31 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.12 scrub starts
Jan 26 09:44:31 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.12 scrub ok
Jan 26 09:44:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:44:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:32.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:44:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:32 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:32 compute-1 ceph-mon[80107]: 12.11 scrub starts
Jan 26 09:44:32 compute-1 ceph-mon[80107]: 12.11 scrub ok
Jan 26 09:44:32 compute-1 ceph-mon[80107]: osdmap e97: 3 total, 3 up, 3 in
Jan 26 09:44:32 compute-1 ceph-mon[80107]: 10.8 scrub starts
Jan 26 09:44:32 compute-1 ceph-mon[80107]: 10.8 scrub ok
Jan 26 09:44:32 compute-1 ceph-mon[80107]: 11.12 scrub starts
Jan 26 09:44:32 compute-1 ceph-mon[80107]: 11.12 scrub ok
Jan 26 09:44:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:32 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488002550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:32.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:32 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e98 e98: 3 total, 3 up, 3 in
Jan 26 09:44:32 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 98 pg[9.1f( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=75/75 les/c/f=76/76/0 sis=98) [1] r=0 lpr=98 pi=[75,98)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:32 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 98 pg[9.f( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=75/75 les/c/f=76/76/0 sis=98) [1] r=0 lpr=98 pi=[75,98)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:32 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 98 pg[9.d( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=6 ec=63/48 lis/c=96/79 les/c/f=97/80/0 sis=98) [1] r=0 lpr=98 pi=[79,98)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:32 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 98 pg[9.d( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=6 ec=63/48 lis/c=96/79 les/c/f=97/80/0 sis=98) [1] r=0 lpr=98 pi=[79,98)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:32 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 98 pg[9.1d( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=96/78 les/c/f=97/79/0 sis=98) [1] r=0 lpr=98 pi=[78,98)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:32 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 98 pg[9.1d( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=96/78 les/c/f=97/79/0 sis=98) [1] r=0 lpr=98 pi=[78,98)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:32 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1d scrub starts
Jan 26 09:44:32 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1d scrub ok
Jan 26 09:44:33 compute-1 ceph-mon[80107]: 11.16 deep-scrub starts
Jan 26 09:44:33 compute-1 ceph-mon[80107]: 11.16 deep-scrub ok
Jan 26 09:44:33 compute-1 ceph-mon[80107]: pgmap v16: 353 pgs: 353 active+clean; 457 KiB data, 143 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:44:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Jan 26 09:44:33 compute-1 ceph-mon[80107]: 11.e scrub starts
Jan 26 09:44:33 compute-1 ceph-mon[80107]: 11.e scrub ok
Jan 26 09:44:33 compute-1 ceph-mon[80107]: 12.8 scrub starts
Jan 26 09:44:33 compute-1 ceph-mon[80107]: 12.8 scrub ok
Jan 26 09:44:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Jan 26 09:44:33 compute-1 ceph-mon[80107]: osdmap e98: 3 total, 3 up, 3 in
Jan 26 09:44:33 compute-1 ceph-mon[80107]: 11.1d scrub starts
Jan 26 09:44:33 compute-1 ceph-mon[80107]: 11.1d scrub ok
Jan 26 09:44:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:44:33 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.18 scrub starts
Jan 26 09:44:33 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.18 scrub ok
Jan 26 09:44:33 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e99 e99: 3 total, 3 up, 3 in
Jan 26 09:44:33 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 99 pg[9.f( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=75/75 les/c/f=76/76/0 sis=99) [1]/[2] r=-1 lpr=99 pi=[75,99)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:33 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 99 pg[9.1f( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=75/75 les/c/f=76/76/0 sis=99) [1]/[2] r=-1 lpr=99 pi=[75,99)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:33 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 99 pg[9.f( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=75/75 les/c/f=76/76/0 sis=99) [1]/[2] r=-1 lpr=99 pi=[75,99)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 09:44:33 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 99 pg[9.1f( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=75/75 les/c/f=76/76/0 sis=99) [1]/[2] r=-1 lpr=99 pi=[75,99)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 09:44:33 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 99 pg[9.d( v 60'1159 (0'0,60'1159] local-lis/les=98/99 n=6 ec=63/48 lis/c=96/79 les/c/f=97/80/0 sis=98) [1] r=0 lpr=98 pi=[79,98)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:44:33 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 99 pg[9.1d( v 60'1159 (0'0,60'1159] local-lis/les=98/99 n=5 ec=63/48 lis/c=96/78 les/c/f=97/79/0 sis=98) [1] r=0 lpr=98 pi=[78,98)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:44:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:34.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:34 compute-1 sudo[89064]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:44:34 compute-1 sudo[89064]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:34 compute-1 sudo[89064]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:34 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:34 compute-1 ceph-mon[80107]: 8.16 scrub starts
Jan 26 09:44:34 compute-1 ceph-mon[80107]: 8.16 scrub ok
Jan 26 09:44:34 compute-1 ceph-mon[80107]: 12.19 deep-scrub starts
Jan 26 09:44:34 compute-1 ceph-mon[80107]: 12.19 deep-scrub ok
Jan 26 09:44:34 compute-1 ceph-mon[80107]: 8.18 scrub starts
Jan 26 09:44:34 compute-1 ceph-mon[80107]: 8.18 scrub ok
Jan 26 09:44:34 compute-1 ceph-mon[80107]: osdmap e99: 3 total, 3 up, 3 in
Jan 26 09:44:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 26 09:44:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:34 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:34 compute-1 sudo[89089]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 09:44:34 compute-1 sudo[89089]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:34 compute-1 sudo[89089]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 09:44:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:34.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 09:44:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:34 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:34 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.14 scrub starts
Jan 26 09:44:34 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.14 scrub ok
Jan 26 09:44:34 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e100 e100: 3 total, 3 up, 3 in
Jan 26 09:44:34 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 100 pg[9.10( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=100) [1] r=0 lpr=100 pi=[63,100)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:35 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:44:35 compute-1 ceph-mon[80107]: pgmap v19: 353 pgs: 353 active+clean; 457 KiB data, 143 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:44:35 compute-1 ceph-mon[80107]: 8.a scrub starts
Jan 26 09:44:35 compute-1 ceph-mon[80107]: 8.a scrub ok
Jan 26 09:44:35 compute-1 ceph-mon[80107]: 10.2 scrub starts
Jan 26 09:44:35 compute-1 ceph-mon[80107]: 10.2 scrub ok
Jan 26 09:44:35 compute-1 ceph-mon[80107]: Reconfiguring node-exporter.compute-0 (unknown last config time)...
Jan 26 09:44:35 compute-1 ceph-mon[80107]: Reconfiguring daemon node-exporter.compute-0 on compute-0
Jan 26 09:44:35 compute-1 ceph-mon[80107]: 11.14 scrub starts
Jan 26 09:44:35 compute-1 ceph-mon[80107]: 11.14 scrub ok
Jan 26 09:44:35 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 26 09:44:35 compute-1 ceph-mon[80107]: osdmap e100: 3 total, 3 up, 3 in
Jan 26 09:44:35 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.19 scrub starts
Jan 26 09:44:35 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.19 scrub ok
Jan 26 09:44:35 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e101 e101: 3 total, 3 up, 3 in
Jan 26 09:44:35 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 101 pg[9.10( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=101) [1]/[0] r=-1 lpr=101 pi=[63,101)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:35 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 101 pg[9.10( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=101) [1]/[0] r=-1 lpr=101 pi=[63,101)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 09:44:35 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 101 pg[9.f( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=6 ec=63/48 lis/c=99/75 les/c/f=100/76/0 sis=101) [1] r=0 lpr=101 pi=[75,101)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:35 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 101 pg[9.f( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=6 ec=63/48 lis/c=99/75 les/c/f=100/76/0 sis=101) [1] r=0 lpr=101 pi=[75,101)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:35 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 101 pg[9.1f( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=99/75 les/c/f=100/76/0 sis=101) [1] r=0 lpr=101 pi=[75,101)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:35 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 101 pg[9.1f( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=99/75 les/c/f=100/76/0 sis=101) [1] r=0 lpr=101 pi=[75,101)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:36.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:36 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:36 compute-1 ceph-mon[80107]: 8.b scrub starts
Jan 26 09:44:36 compute-1 ceph-mon[80107]: 8.b scrub ok
Jan 26 09:44:36 compute-1 ceph-mon[80107]: 10.19 deep-scrub starts
Jan 26 09:44:36 compute-1 ceph-mon[80107]: 10.19 deep-scrub ok
Jan 26 09:44:36 compute-1 ceph-mon[80107]: 8.19 scrub starts
Jan 26 09:44:36 compute-1 ceph-mon[80107]: 8.19 scrub ok
Jan 26 09:44:36 compute-1 ceph-mon[80107]: osdmap e101: 3 total, 3 up, 3 in
Jan 26 09:44:36 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:36 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:36 compute-1 ceph-mon[80107]: Reconfiguring alertmanager.compute-0 (dependencies changed)...
Jan 26 09:44:36 compute-1 ceph-mon[80107]: Reconfiguring daemon alertmanager.compute-0 on compute-0
Jan 26 09:44:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:36 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:36.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:36 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:36 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.d scrub starts
Jan 26 09:44:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e102 e102: 3 total, 3 up, 3 in
Jan 26 09:44:36 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.d scrub ok
Jan 26 09:44:36 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 102 pg[9.f( v 60'1159 (0'0,60'1159] local-lis/les=101/102 n=6 ec=63/48 lis/c=99/75 les/c/f=100/76/0 sis=101) [1] r=0 lpr=101 pi=[75,101)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:44:36 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 102 pg[9.1f( v 60'1159 (0'0,60'1159] local-lis/les=101/102 n=5 ec=63/48 lis/c=99/75 les/c/f=100/76/0 sis=101) [1] r=0 lpr=101 pi=[75,101)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:44:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e103 e103: 3 total, 3 up, 3 in
Jan 26 09:44:37 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 103 pg[9.10( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=2 ec=63/48 lis/c=101/63 les/c/f=102/65/0 sis=103) [1] r=0 lpr=103 pi=[63,103)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:37 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 103 pg[9.10( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=2 ec=63/48 lis/c=101/63 les/c/f=102/65/0 sis=103) [1] r=0 lpr=103 pi=[63,103)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:37 compute-1 ceph-mon[80107]: pgmap v22: 353 pgs: 2 remapped+peering, 351 active+clean; 457 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 82 B/s, 3 objects/s recovering
Jan 26 09:44:37 compute-1 ceph-mon[80107]: 12.18 scrub starts
Jan 26 09:44:37 compute-1 ceph-mon[80107]: 12.18 scrub ok
Jan 26 09:44:37 compute-1 ceph-mon[80107]: 10.18 scrub starts
Jan 26 09:44:37 compute-1 ceph-mon[80107]: 10.18 scrub ok
Jan 26 09:44:37 compute-1 ceph-mon[80107]: 9.d scrub starts
Jan 26 09:44:37 compute-1 ceph-mon[80107]: 9.d scrub ok
Jan 26 09:44:37 compute-1 ceph-mon[80107]: osdmap e102: 3 total, 3 up, 3 in
Jan 26 09:44:37 compute-1 ceph-mon[80107]: osdmap e103: 3 total, 3 up, 3 in
Jan 26 09:44:37 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.1a scrub starts
Jan 26 09:44:37 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.1a scrub ok
Jan 26 09:44:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:38.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:38 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e104 e104: 3 total, 3 up, 3 in
Jan 26 09:44:38 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 104 pg[9.10( v 60'1159 (0'0,60'1159] local-lis/les=103/104 n=2 ec=63/48 lis/c=101/63 les/c/f=102/65/0 sis=103) [1] r=0 lpr=103 pi=[63,103)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:44:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:38 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488002550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:38 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488002550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:38 compute-1 ceph-mon[80107]: 12.1a scrub starts
Jan 26 09:44:38 compute-1 ceph-mon[80107]: 12.1a scrub ok
Jan 26 09:44:38 compute-1 ceph-mon[80107]: 10.14 scrub starts
Jan 26 09:44:38 compute-1 ceph-mon[80107]: 10.14 scrub ok
Jan 26 09:44:38 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:38 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:38 compute-1 ceph-mon[80107]: Reconfiguring grafana.compute-0 (dependencies changed)...
Jan 26 09:44:38 compute-1 ceph-mon[80107]: Reconfiguring daemon grafana.compute-0 on compute-0
Jan 26 09:44:38 compute-1 ceph-mon[80107]: 9.1a scrub starts
Jan 26 09:44:38 compute-1 ceph-mon[80107]: 9.1a scrub ok
Jan 26 09:44:38 compute-1 ceph-mon[80107]: osdmap e104: 3 total, 3 up, 3 in
Jan 26 09:44:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:44:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:38.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:44:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:38 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:38 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.1e scrub starts
Jan 26 09:44:38 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.1e scrub ok
Jan 26 09:44:39 compute-1 ceph-mon[80107]: pgmap v25: 353 pgs: 2 remapped+peering, 351 active+clean; 457 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 82 B/s, 3 objects/s recovering
Jan 26 09:44:39 compute-1 ceph-mon[80107]: 9.15 scrub starts
Jan 26 09:44:39 compute-1 ceph-mon[80107]: 9.15 scrub ok
Jan 26 09:44:39 compute-1 ceph-mon[80107]: 10.15 scrub starts
Jan 26 09:44:39 compute-1 ceph-mon[80107]: 10.15 scrub ok
Jan 26 09:44:39 compute-1 ceph-mon[80107]: 9.1e scrub starts
Jan 26 09:44:39 compute-1 ceph-mon[80107]: 9.1e scrub ok
Jan 26 09:44:39 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.1d scrub starts
Jan 26 09:44:39 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.1d scrub ok
Jan 26 09:44:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:40.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:40 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e104 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:44:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:40 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488002550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:40 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488002550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:40.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:40 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:40 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:44:40 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.f scrub starts
Jan 26 09:44:40 compute-1 ceph-mon[80107]: 9.9 scrub starts
Jan 26 09:44:40 compute-1 ceph-mon[80107]: 9.9 scrub ok
Jan 26 09:44:40 compute-1 ceph-mon[80107]: 12.10 scrub starts
Jan 26 09:44:40 compute-1 ceph-mon[80107]: 12.10 scrub ok
Jan 26 09:44:40 compute-1 ceph-mon[80107]: 9.1d scrub starts
Jan 26 09:44:40 compute-1 ceph-mon[80107]: 9.1d scrub ok
Jan 26 09:44:40 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Jan 26 09:44:40 compute-1 ceph-mon[80107]: 9.14 scrub starts
Jan 26 09:44:40 compute-1 ceph-mon[80107]: 9.14 scrub ok
Jan 26 09:44:40 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e105 e105: 3 total, 3 up, 3 in
Jan 26 09:44:40 compute-1 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.f scrub ok
Jan 26 09:44:41 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 105 pg[9.11( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=105) [1] r=0 lpr=105 pi=[63,105)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:41 compute-1 sudo[89117]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:44:41 compute-1 sudo[89117]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:41 compute-1 sudo[89117]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:41 compute-1 sudo[89142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 09:44:41 compute-1 sudo[89142]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:41 compute-1 podman[89183]: 2026-01-26 09:44:41.456821987 +0000 UTC m=+0.036400771 container create 0b2fbc6cc3f055852090c599c19075968ea058f0b3a72a077f076777f3faf255 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_chatterjee, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:44:41 compute-1 systemd[84066]: Starting Mark boot as successful...
Jan 26 09:44:41 compute-1 systemd[1]: Started libpod-conmon-0b2fbc6cc3f055852090c599c19075968ea058f0b3a72a077f076777f3faf255.scope.
Jan 26 09:44:41 compute-1 systemd[84066]: Finished Mark boot as successful.
Jan 26 09:44:41 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:44:41 compute-1 podman[89183]: 2026-01-26 09:44:41.50945272 +0000 UTC m=+0.089031494 container init 0b2fbc6cc3f055852090c599c19075968ea058f0b3a72a077f076777f3faf255 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_chatterjee, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 09:44:41 compute-1 podman[89183]: 2026-01-26 09:44:41.515465888 +0000 UTC m=+0.095044672 container start 0b2fbc6cc3f055852090c599c19075968ea058f0b3a72a077f076777f3faf255 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_chatterjee, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True)
Jan 26 09:44:41 compute-1 podman[89183]: 2026-01-26 09:44:41.518180478 +0000 UTC m=+0.097759262 container attach 0b2fbc6cc3f055852090c599c19075968ea058f0b3a72a077f076777f3faf255 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_chatterjee, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1)
Jan 26 09:44:41 compute-1 unruffled_chatterjee[89201]: 167 167
Jan 26 09:44:41 compute-1 systemd[1]: libpod-0b2fbc6cc3f055852090c599c19075968ea058f0b3a72a077f076777f3faf255.scope: Deactivated successfully.
Jan 26 09:44:41 compute-1 podman[89183]: 2026-01-26 09:44:41.520984802 +0000 UTC m=+0.100563576 container died 0b2fbc6cc3f055852090c599c19075968ea058f0b3a72a077f076777f3faf255 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_chatterjee, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 09:44:41 compute-1 podman[89183]: 2026-01-26 09:44:41.440808198 +0000 UTC m=+0.020386992 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:44:41 compute-1 systemd[1]: var-lib-containers-storage-overlay-d11027f73f59693698317623e656a37bfa891ba8c1611aa47bed5adef81e33fe-merged.mount: Deactivated successfully.
Jan 26 09:44:41 compute-1 podman[89183]: 2026-01-26 09:44:41.552020612 +0000 UTC m=+0.131599396 container remove 0b2fbc6cc3f055852090c599c19075968ea058f0b3a72a077f076777f3faf255 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_chatterjee, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 09:44:41 compute-1 systemd[1]: libpod-conmon-0b2fbc6cc3f055852090c599c19075968ea058f0b3a72a077f076777f3faf255.scope: Deactivated successfully.
Jan 26 09:44:41 compute-1 sudo[89142]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:42 compute-1 ceph-mon[80107]: pgmap v27: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:44:42 compute-1 ceph-mon[80107]: 9.19 scrub starts
Jan 26 09:44:42 compute-1 ceph-mon[80107]: 9.19 scrub ok
Jan 26 09:44:42 compute-1 ceph-mon[80107]: 9.f scrub starts
Jan 26 09:44:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Jan 26 09:44:42 compute-1 ceph-mon[80107]: osdmap e105: 3 total, 3 up, 3 in
Jan 26 09:44:42 compute-1 ceph-mon[80107]: 9.f scrub ok
Jan 26 09:44:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:42 compute-1 ceph-mon[80107]: Reconfiguring rgw.rgw.compute-1.fbcidm (unknown last config time)...
Jan 26 09:44:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.fbcidm", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 26 09:44:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:44:42 compute-1 ceph-mon[80107]: Reconfiguring daemon rgw.rgw.compute-1.fbcidm on compute-1
Jan 26 09:44:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
Jan 26 09:44:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
Jan 26 09:44:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Jan 26 09:44:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:42 compute-1 ceph-mon[80107]: 9.c scrub starts
Jan 26 09:44:42 compute-1 ceph-mon[80107]: 9.c scrub ok
Jan 26 09:44:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e106 e106: 3 total, 3 up, 3 in
Jan 26 09:44:42 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 106 pg[9.11( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=106) [1]/[0] r=-1 lpr=106 pi=[63,106)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:42 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 106 pg[9.11( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=106) [1]/[0] r=-1 lpr=106 pi=[63,106)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 09:44:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:42.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:42 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:42 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:42 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:42 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:42.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:42 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:42 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:43 compute-1 ceph-mon[80107]: 9.1b scrub starts
Jan 26 09:44:43 compute-1 ceph-mon[80107]: 9.1b scrub ok
Jan 26 09:44:43 compute-1 ceph-mon[80107]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
Jan 26 09:44:43 compute-1 ceph-mon[80107]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
Jan 26 09:44:43 compute-1 ceph-mon[80107]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Jan 26 09:44:43 compute-1 ceph-mon[80107]: osdmap e106: 3 total, 3 up, 3 in
Jan 26 09:44:43 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Jan 26 09:44:43 compute-1 ceph-mon[80107]: 9.1 scrub starts
Jan 26 09:44:43 compute-1 ceph-mon[80107]: 9.1 scrub ok
Jan 26 09:44:43 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e107 e107: 3 total, 3 up, 3 in
Jan 26 09:44:43 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 107 pg[9.12( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=107) [1] r=0 lpr=107 pi=[63,107)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:43 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:44:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:43 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:44:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:43 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:44:44 compute-1 ceph-mon[80107]: pgmap v30: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:44:44 compute-1 ceph-mon[80107]: 9.18 scrub starts
Jan 26 09:44:44 compute-1 ceph-mon[80107]: 9.18 scrub ok
Jan 26 09:44:44 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Jan 26 09:44:44 compute-1 ceph-mon[80107]: osdmap e107: 3 total, 3 up, 3 in
Jan 26 09:44:44 compute-1 ceph-mon[80107]: 9.0 scrub starts
Jan 26 09:44:44 compute-1 ceph-mon[80107]: 9.0 scrub ok
Jan 26 09:44:44 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e108 e108: 3 total, 3 up, 3 in
Jan 26 09:44:44 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 108 pg[9.11( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=106/63 les/c/f=107/65/0 sis=108) [1] r=0 lpr=108 pi=[63,108)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:44 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 108 pg[9.11( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=106/63 les/c/f=107/65/0 sis=108) [1] r=0 lpr=108 pi=[63,108)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:44 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 108 pg[9.12( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=108) [1]/[0] r=-1 lpr=108 pi=[63,108)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:44 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 108 pg[9.12( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=108) [1]/[0] r=-1 lpr=108 pi=[63,108)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 09:44:44 compute-1 sshd-session[89219]: Accepted publickey for zuul from 192.168.122.30 port 40942 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:44:44 compute-1 systemd-logind[783]: New session 38 of user zuul.
Jan 26 09:44:44 compute-1 systemd[1]: Started Session 38 of User zuul.
Jan 26 09:44:44 compute-1 sshd-session[89219]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:44:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:44.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:44 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:44 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 09:44:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:44.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 09:44:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:44 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:45 compute-1 python3.9[89372]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 26 09:44:45 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:44:45 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e109 e109: 3 total, 3 up, 3 in
Jan 26 09:44:45 compute-1 ceph-mon[80107]: 9.7 scrub starts
Jan 26 09:44:45 compute-1 ceph-mon[80107]: 9.7 scrub ok
Jan 26 09:44:45 compute-1 ceph-mon[80107]: osdmap e108: 3 total, 3 up, 3 in
Jan 26 09:44:45 compute-1 ceph-mon[80107]: pgmap v33: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:44:45 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Jan 26 09:44:45 compute-1 ceph-mon[80107]: 9.b scrub starts
Jan 26 09:44:45 compute-1 ceph-mon[80107]: 9.b scrub ok
Jan 26 09:44:45 compute-1 ceph-mon[80107]: 9.2 scrub starts
Jan 26 09:44:45 compute-1 ceph-mon[80107]: 9.2 scrub ok
Jan 26 09:44:45 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 109 pg[9.11( v 60'1159 (0'0,60'1159] local-lis/les=108/109 n=5 ec=63/48 lis/c=106/63 les/c/f=107/65/0 sis=108) [1] r=0 lpr=108 pi=[63,108)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:44:46 compute-1 python3.9[89547]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:44:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:46.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:46 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:46 compute-1 ceph-mon[80107]: 9.3 scrub starts
Jan 26 09:44:46 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Jan 26 09:44:46 compute-1 ceph-mon[80107]: 9.3 scrub ok
Jan 26 09:44:46 compute-1 ceph-mon[80107]: osdmap e109: 3 total, 3 up, 3 in
Jan 26 09:44:46 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:46 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:44:46 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 09:44:46 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:46 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:46 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 09:44:46 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 09:44:46 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:44:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:46 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488003880 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:46 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e110 e110: 3 total, 3 up, 3 in
Jan 26 09:44:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:46 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:46 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 110 pg[9.12( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=4 ec=63/48 lis/c=108/63 les/c/f=109/65/0 sis=110) [1] r=0 lpr=110 pi=[63,110)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:46 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 110 pg[9.12( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=4 ec=63/48 lis/c=108/63 les/c/f=109/65/0 sis=110) [1] r=0 lpr=110 pi=[63,110)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:46.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:46 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:46 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:44:47 compute-1 sudo[89701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwzudgozclgcocqntygoimfafqdlwgkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420687.012526-89-178038450960173/AnsiballZ_command.py'
Jan 26 09:44:47 compute-1 sudo[89701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:44:47 compute-1 python3.9[89703]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:44:47 compute-1 sudo[89701]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:47 compute-1 ceph-mon[80107]: pgmap v35: 353 pgs: 1 remapped+peering, 1 peering, 351 active+clean; 457 KiB data, 152 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:44:47 compute-1 ceph-mon[80107]: 9.17 scrub starts
Jan 26 09:44:47 compute-1 ceph-mon[80107]: 9.17 scrub ok
Jan 26 09:44:47 compute-1 ceph-mon[80107]: osdmap e110: 3 total, 3 up, 3 in
Jan 26 09:44:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e111 e111: 3 total, 3 up, 3 in
Jan 26 09:44:47 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 111 pg[9.12( v 60'1159 (0'0,60'1159] local-lis/les=110/111 n=4 ec=63/48 lis/c=108/63 les/c/f=109/65/0 sis=110) [1] r=0 lpr=110 pi=[63,110)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:44:48 compute-1 sudo[89855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlckgtbebxmrnmeiajkwkpkzakgsckhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420688.0782278-125-781261997404/AnsiballZ_stat.py'
Jan 26 09:44:48 compute-1 sudo[89855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:44:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:48.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:48 compute-1 python3.9[89857]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:44:48 compute-1 sudo[89855]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:48 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:48 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488003880 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:48 compute-1 ceph-mon[80107]: osdmap e111: 3 total, 3 up, 3 in
Jan 26 09:44:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:44:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:48.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:48 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:49 compute-1 sudo[90010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zswxfxcyfotydrpjhgkbjjkadkdmbpnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420689.0806832-158-176075768510783/AnsiballZ_file.py'
Jan 26 09:44:49 compute-1 sudo[90010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:44:49 compute-1 python3.9[90012]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:44:49 compute-1 sudo[90010]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:49 compute-1 ceph-mon[80107]: pgmap v38: 353 pgs: 1 remapped+peering, 1 peering, 351 active+clean; 457 KiB data, 152 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:44:50 compute-1 sudo[90162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhrjyptqwsymvxjccslzemwflvddwabc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420690.0908616-185-234749104983281/AnsiballZ_file.py'
Jan 26 09:44:50 compute-1 sudo[90162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:44:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:50.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:50 compute-1 python3.9[90164]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:44:50 compute-1 sudo[90162]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:50 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:44:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:50 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:50 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14940037a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:50.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:50 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488003880 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:50 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e112 e112: 3 total, 3 up, 3 in
Jan 26 09:44:50 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Jan 26 09:44:50 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:50 compute-1 sudo[90241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 09:44:50 compute-1 sudo[90241]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:51 compute-1 sudo[90241]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:51 compute-1 python3.9[90339]: ansible-ansible.builtin.service_facts Invoked
Jan 26 09:44:51 compute-1 network[90356]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 09:44:51 compute-1 network[90357]: 'network-scripts' will be removed from distribution in near future.
Jan 26 09:44:51 compute-1 network[90358]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 09:44:51 compute-1 ceph-mon[80107]: pgmap v39: 353 pgs: 353 active+clean; 457 KiB data, 152 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:44:51 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Jan 26 09:44:51 compute-1 ceph-mon[80107]: osdmap e112: 3 total, 3 up, 3 in
Jan 26 09:44:51 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:44:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:52.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:52 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:52 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:52.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:52 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14940037c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e113 e113: 3 total, 3 up, 3 in
Jan 26 09:44:52 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 113 pg[9.15( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=78/78 les/c/f=79/79/0 sis=113) [1] r=0 lpr=113 pi=[78,113)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:52 compute-1 ceph-mon[80107]: 9.1c scrub starts
Jan 26 09:44:52 compute-1 ceph-mon[80107]: 9.1c scrub ok
Jan 26 09:44:52 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Jan 26 09:44:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094453 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:44:53 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e114 e114: 3 total, 3 up, 3 in
Jan 26 09:44:53 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 114 pg[9.15( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=78/78 les/c/f=79/79/0 sis=114) [1]/[2] r=-1 lpr=114 pi=[78,114)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:53 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 114 pg[9.15( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=78/78 les/c/f=79/79/0 sis=114) [1]/[2] r=-1 lpr=114 pi=[78,114)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 09:44:53 compute-1 ceph-mon[80107]: pgmap v41: 353 pgs: 353 active+clean; 457 KiB data, 152 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:44:53 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Jan 26 09:44:53 compute-1 ceph-mon[80107]: osdmap e113: 3 total, 3 up, 3 in
Jan 26 09:44:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:54.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:54 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488003880 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:54 compute-1 sudo[90469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:44:54 compute-1 sudo[90469]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:44:54 compute-1 sudo[90469]: pam_unix(sudo:session): session closed for user root
Jan 26 09:44:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:54 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:54.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:54 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:54 compute-1 ceph-mon[80107]: osdmap e114: 3 total, 3 up, 3 in
Jan 26 09:44:54 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Jan 26 09:44:54 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e115 e115: 3 total, 3 up, 3 in
Jan 26 09:44:54 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 115 pg[9.16( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=4 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=115 pruub=13.068075180s) [2] r=-1 lpr=115 pi=[79,115)/1 crt=60'1159 mlcod 0'0 active pruub 286.115386963s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:54 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 115 pg[9.16( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=4 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=115 pruub=13.068030357s) [2] r=-1 lpr=115 pi=[79,115)/1 crt=60'1159 mlcod 0'0 unknown NOTIFY pruub 286.115386963s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:44:55 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:44:55 compute-1 ceph-mon[80107]: pgmap v44: 353 pgs: 353 active+clean; 457 KiB data, 152 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:44:55 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Jan 26 09:44:55 compute-1 ceph-mon[80107]: osdmap e115: 3 total, 3 up, 3 in
Jan 26 09:44:55 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e116 e116: 3 total, 3 up, 3 in
Jan 26 09:44:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 116 pg[9.15( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=4 ec=63/48 lis/c=114/78 les/c/f=115/79/0 sis=116) [1] r=0 lpr=116 pi=[78,116)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 116 pg[9.15( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=4 ec=63/48 lis/c=114/78 les/c/f=115/79/0 sis=116) [1] r=0 lpr=116 pi=[78,116)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 116 pg[9.16( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=4 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=116) [2]/[1] r=0 lpr=116 pi=[79,116)/1 crt=60'1159 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:55 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 116 pg[9.16( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=4 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=116) [2]/[1] r=0 lpr=116 pi=[79,116)/1 crt=60'1159 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 26 09:44:55 compute-1 sshd-session[90451]: Connection closed by authenticating user root 178.62.249.31 port 51542 [preauth]
Jan 26 09:44:56 compute-1 python3.9[90648]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:44:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:56.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:56 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14940037e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:56 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488003880 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:56.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:56 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e117 e117: 3 total, 3 up, 3 in
Jan 26 09:44:56 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 117 pg[9.15( v 60'1159 (0'0,60'1159] local-lis/les=116/117 n=4 ec=63/48 lis/c=114/78 les/c/f=115/79/0 sis=116) [1] r=0 lpr=116 pi=[78,116)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:44:56 compute-1 ceph-mon[80107]: osdmap e116: 3 total, 3 up, 3 in
Jan 26 09:44:57 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 117 pg[9.16( v 60'1159 (0'0,60'1159] local-lis/les=116/117 n=4 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=116) [2]/[1] async=[2] r=0 lpr=116 pi=[79,116)/1 crt=60'1159 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:44:57 compute-1 python3.9[90798]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:44:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e118 e118: 3 total, 3 up, 3 in
Jan 26 09:44:57 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 118 pg[9.16( v 60'1159 (0'0,60'1159] local-lis/les=116/117 n=4 ec=63/48 lis/c=116/79 les/c/f=117/80/0 sis=118 pruub=15.382803917s) [2] async=[2] r=-1 lpr=118 pi=[79,118)/1 crt=60'1159 mlcod 60'1159 active pruub 291.080383301s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:44:57 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 118 pg[9.16( v 60'1159 (0'0,60'1159] local-lis/les=116/117 n=4 ec=63/48 lis/c=116/79 les/c/f=117/80/0 sis=118 pruub=15.382746696s) [2] r=-1 lpr=118 pi=[79,118)/1 crt=60'1159 mlcod 0'0 unknown NOTIFY pruub 291.080383301s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:44:58 compute-1 ceph-mon[80107]: pgmap v47: 353 pgs: 1 remapped+peering, 352 active+clean; 457 KiB data, 152 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:44:58 compute-1 ceph-mon[80107]: osdmap e117: 3 total, 3 up, 3 in
Jan 26 09:44:58 compute-1 ceph-mon[80107]: osdmap e118: 3 total, 3 up, 3 in
Jan 26 09:44:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:44:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:58.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:44:58 compute-1 sshd-session[90828]: Connection closed by authenticating user root 152.42.136.110 port 38264 [preauth]
Jan 26 09:44:58 compute-1 python3.9[90955]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:44:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e119 e119: 3 total, 3 up, 3 in
Jan 26 09:44:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:58 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:58 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003800 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:44:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:44:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:58.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:44:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:58 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003800 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:44:59 compute-1 sudo[91111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qksgjfogeylqdftqxqutfveyedcmiize ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420699.1802216-329-65856792651990/AnsiballZ_setup.py'
Jan 26 09:44:59 compute-1 sudo[91111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:44:59 compute-1 ceph-mon[80107]: pgmap v50: 353 pgs: 1 remapped+peering, 352 active+clean; 457 KiB data, 152 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:44:59 compute-1 ceph-mon[80107]: osdmap e119: 3 total, 3 up, 3 in
Jan 26 09:44:59 compute-1 python3.9[91114]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 09:44:59 compute-1 sudo[91111]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:00 compute-1 sudo[91196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzelectgztkapsdpgejpftsmybdzlnlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420699.1802216-329-65856792651990/AnsiballZ_dnf.py'
Jan 26 09:45:00 compute-1 sudo[91196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:45:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:00.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:00 compute-1 python3.9[91198]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:45:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:00 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:00 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:45:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:00 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:00 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Jan 26 09:45:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:45:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:00.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:45:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:00 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4001320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:00 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e120 e120: 3 total, 3 up, 3 in
Jan 26 09:45:01 compute-1 ceph-mon[80107]: pgmap v52: 353 pgs: 353 active+clean; 457 KiB data, 152 MiB used, 60 GiB / 60 GiB avail; 447 B/s rd, 0 op/s; 48 B/s, 1 objects/s recovering
Jan 26 09:45:01 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Jan 26 09:45:01 compute-1 ceph-mon[80107]: osdmap e120: 3 total, 3 up, 3 in
Jan 26 09:45:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:02.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488003880 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:02 compute-1 sshd-session[91248]: Invalid user admin from 104.248.198.71 port 58202
Jan 26 09:45:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:02.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:02 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Jan 26 09:45:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e121 e121: 3 total, 3 up, 3 in
Jan 26 09:45:03 compute-1 sshd-session[91248]: Connection closed by invalid user admin 104.248.198.71 port 58202 [preauth]
Jan 26 09:45:03 compute-1 ceph-mon[80107]: pgmap v54: 353 pgs: 353 active+clean; 457 KiB data, 152 MiB used, 60 GiB / 60 GiB avail; 367 B/s rd, 0 op/s; 39 B/s, 1 objects/s recovering
Jan 26 09:45:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Jan 26 09:45:03 compute-1 ceph-mon[80107]: osdmap e121: 3 total, 3 up, 3 in
Jan 26 09:45:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:45:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:04.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:04 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4001320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:04 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488003880 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:04 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:04.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Jan 26 09:45:04 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e122 e122: 3 total, 3 up, 3 in
Jan 26 09:45:05 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e122 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 09:45:05 compute-1 ceph-mon[80107]: pgmap v56: 353 pgs: 353 active+clean; 457 KiB data, 152 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s; 36 B/s, 1 objects/s recovering
Jan 26 09:45:05 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Jan 26 09:45:05 compute-1 ceph-mon[80107]: osdmap e122: 3 total, 3 up, 3 in
Jan 26 09:45:05 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e123 e123: 3 total, 3 up, 3 in
Jan 26 09:45:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:06.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:06 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:06 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4001320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:06 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14880041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 09:45:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:06.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 09:45:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e124 e124: 3 total, 3 up, 3 in
Jan 26 09:45:07 compute-1 ceph-mon[80107]: osdmap e123: 3 total, 3 up, 3 in
Jan 26 09:45:07 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Jan 26 09:45:07 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 124 pg[9.1a( v 60'1159 (0'0,60'1159] local-lis/les=89/90 n=4 ec=63/48 lis/c=89/89 les/c/f=90/90/0 sis=124 pruub=12.692170143s) [0] r=-1 lpr=124 pi=[89,124)/1 crt=60'1159 mlcod 0'0 active pruub 297.793975830s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:45:07 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 124 pg[9.1a( v 60'1159 (0'0,60'1159] local-lis/les=89/90 n=4 ec=63/48 lis/c=89/89 les/c/f=90/90/0 sis=124 pruub=12.692132950s) [0] r=-1 lpr=124 pi=[89,124)/1 crt=60'1159 mlcod 0'0 unknown NOTIFY pruub 297.793975830s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:45:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e125 e125: 3 total, 3 up, 3 in
Jan 26 09:45:07 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 125 pg[9.1a( v 60'1159 (0'0,60'1159] local-lis/les=89/90 n=4 ec=63/48 lis/c=89/89 les/c/f=90/90/0 sis=125) [0]/[1] r=0 lpr=125 pi=[89,125)/1 crt=60'1159 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:45:07 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 125 pg[9.1a( v 60'1159 (0'0,60'1159] local-lis/les=89/90 n=4 ec=63/48 lis/c=89/89 les/c/f=90/90/0 sis=125) [0]/[1] r=0 lpr=125 pi=[89,125)/1 crt=60'1159 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 26 09:45:08 compute-1 ceph-mon[80107]: pgmap v59: 353 pgs: 353 active+clean; 457 KiB data, 152 MiB used, 60 GiB / 60 GiB avail; 181 B/s rd, 0 op/s
Jan 26 09:45:08 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Jan 26 09:45:08 compute-1 ceph-mon[80107]: osdmap e124: 3 total, 3 up, 3 in
Jan 26 09:45:08 compute-1 ceph-mon[80107]: osdmap e125: 3 total, 3 up, 3 in
Jan 26 09:45:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:08.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:08 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e126 e126: 3 total, 3 up, 3 in
Jan 26 09:45:08 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 126 pg[9.1a( v 60'1159 (0'0,60'1159] local-lis/les=125/126 n=4 ec=63/48 lis/c=89/89 les/c/f=90/90/0 sis=125) [0]/[1] async=[0] r=0 lpr=125 pi=[89,125)/1 crt=60'1159 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:45:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:08 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:08 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:08 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4002300 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:45:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:08.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:45:09 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Jan 26 09:45:09 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Jan 26 09:45:09 compute-1 ceph-mon[80107]: osdmap e126: 3 total, 3 up, 3 in
Jan 26 09:45:09 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e127 e127: 3 total, 3 up, 3 in
Jan 26 09:45:09 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 127 pg[9.1a( v 60'1159 (0'0,60'1159] local-lis/les=125/126 n=4 ec=63/48 lis/c=125/89 les/c/f=126/90/0 sis=127 pruub=14.990523338s) [0] async=[0] r=-1 lpr=127 pi=[89,127)/1 crt=60'1159 mlcod 60'1159 active pruub 302.718963623s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:45:09 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 127 pg[9.1a( v 60'1159 (0'0,60'1159] local-lis/les=125/126 n=4 ec=63/48 lis/c=125/89 les/c/f=126/90/0 sis=127 pruub=14.990057945s) [0] r=-1 lpr=127 pi=[89,127)/1 crt=60'1159 mlcod 0'0 unknown NOTIFY pruub 302.718963623s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:45:10 compute-1 ceph-mon[80107]: pgmap v62: 353 pgs: 353 active+clean; 457 KiB data, 152 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:45:10 compute-1 ceph-mon[80107]: osdmap e127: 3 total, 3 up, 3 in
Jan 26 09:45:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:10.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:10 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e128 e128: 3 total, 3 up, 3 in
Jan 26 09:45:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:10 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14880041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:10 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:45:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:10 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478001f50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:10 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:10.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:11 compute-1 ceph-mon[80107]: pgmap v65: 353 pgs: 1 unknown, 1 active+remapped, 351 active+clean; 457 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s; 54 B/s, 2 objects/s recovering
Jan 26 09:45:11 compute-1 ceph-mon[80107]: osdmap e128: 3 total, 3 up, 3 in
Jan 26 09:45:11 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e129 e129: 3 total, 3 up, 3 in
Jan 26 09:45:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:12.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:12 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4002300 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:12 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14880041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:12 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14780020f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:12.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:12 compute-1 ceph-mon[80107]: osdmap e129: 3 total, 3 up, 3 in
Jan 26 09:45:13 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e130 e130: 3 total, 3 up, 3 in
Jan 26 09:45:14 compute-1 ceph-mon[80107]: pgmap v68: 353 pgs: 1 unknown, 1 active+remapped, 351 active+clean; 457 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s; 54 B/s, 2 objects/s recovering
Jan 26 09:45:14 compute-1 ceph-mon[80107]: osdmap e130: 3 total, 3 up, 3 in
Jan 26 09:45:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:45:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:14.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:45:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:14 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:14 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4002300 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:14 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14880041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:45:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:14.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:45:14 compute-1 sudo[91283]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:45:14 compute-1 sudo[91283]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:45:14 compute-1 sudo[91283]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:15 compute-1 ceph-mon[80107]: pgmap v70: 353 pgs: 1 unknown, 1 active+remapped, 351 active+clean; 457 KiB data, 153 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:45:15 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:45:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:16.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:16 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e131 e131: 3 total, 3 up, 3 in
Jan 26 09:45:16 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Jan 26 09:45:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:16 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:16 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:16 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4002300 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:16.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:17 compute-1 ceph-mon[80107]: pgmap v71: 353 pgs: 353 active+clean; 457 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s; 18 B/s, 0 objects/s recovering
Jan 26 09:45:17 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Jan 26 09:45:17 compute-1 ceph-mon[80107]: osdmap e131: 3 total, 3 up, 3 in
Jan 26 09:45:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:18.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Jan 26 09:45:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:45:18 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e132 e132: 3 total, 3 up, 3 in
Jan 26 09:45:18 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 132 pg[9.1d( v 60'1159 (0'0,60'1159] local-lis/les=98/99 n=5 ec=63/48 lis/c=98/98 les/c/f=99/99/0 sis=132 pruub=11.250358582s) [2] r=-1 lpr=132 pi=[98,132)/1 crt=60'1159 mlcod 0'0 active pruub 308.035430908s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:45:18 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 132 pg[9.1d( v 60'1159 (0'0,60'1159] local-lis/les=98/99 n=5 ec=63/48 lis/c=98/98 les/c/f=99/99/0 sis=132 pruub=11.250319481s) [2] r=-1 lpr=132 pi=[98,132)/1 crt=60'1159 mlcod 0'0 unknown NOTIFY pruub 308.035430908s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:45:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:18 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14880041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:18 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:18 compute-1 kernel: ganesha.nfsd[87014]: segfault at 50 ip 00007f152f46c32e sp 00007f14b5ffa210 error 4 in libntirpc.so.5.8[7f152f451000+2c000] likely on CPU 0 (core 0, socket 0)
Jan 26 09:45:18 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 09:45:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:18 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041c0 fd 48 proxy ignored for local
Jan 26 09:45:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:18.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:18 compute-1 systemd[1]: Created slice Slice /system/systemd-coredump.
Jan 26 09:45:18 compute-1 systemd[1]: Started Process Core Dump (PID 91352/UID 0).
Jan 26 09:45:19 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e133 e133: 3 total, 3 up, 3 in
Jan 26 09:45:19 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 133 pg[9.1d( v 60'1159 (0'0,60'1159] local-lis/les=98/99 n=5 ec=63/48 lis/c=98/98 les/c/f=99/99/0 sis=133) [2]/[1] r=0 lpr=133 pi=[98,133)/1 crt=60'1159 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:45:19 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 133 pg[9.1d( v 60'1159 (0'0,60'1159] local-lis/les=98/99 n=5 ec=63/48 lis/c=98/98 les/c/f=99/99/0 sis=133) [2]/[1] r=0 lpr=133 pi=[98,133)/1 crt=60'1159 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 26 09:45:19 compute-1 systemd-coredump[91353]: Process 86204 (ganesha.nfsd) of user 0 dumped core.
                                                   
                                                   Stack trace of thread 55:
                                                   #0  0x00007f152f46c32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                   ELF object binary architecture: AMD x86-64
Jan 26 09:45:19 compute-1 ceph-mon[80107]: pgmap v73: 353 pgs: 353 active+clean; 457 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 148 B/s rd, 0 op/s; 15 B/s, 0 objects/s recovering
Jan 26 09:45:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Jan 26 09:45:19 compute-1 ceph-mon[80107]: osdmap e132: 3 total, 3 up, 3 in
Jan 26 09:45:19 compute-1 systemd[1]: systemd-coredump@0-91352-0.service: Deactivated successfully.
Jan 26 09:45:19 compute-1 systemd[1]: systemd-coredump@0-91352-0.service: Consumed 1.002s CPU time.
Jan 26 09:45:19 compute-1 podman[91359]: 2026-01-26 09:45:19.975237784 +0000 UTC m=+0.025710335 container died 19c026de89e744ddc79168cf6d35f36da879cfc6e36f4542a44b0a0f6b664c36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Jan 26 09:45:19 compute-1 systemd[1]: var-lib-containers-storage-overlay-31ac196e154da090dbbc83357691609c1e8abfb8268037421008e730e1429aee-merged.mount: Deactivated successfully.
Jan 26 09:45:20 compute-1 podman[91359]: 2026-01-26 09:45:20.011432283 +0000 UTC m=+0.061904814 container remove 19c026de89e744ddc79168cf6d35f36da879cfc6e36f4542a44b0a0f6b664c36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 09:45:20 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 09:45:20 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 09:45:20 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.483s CPU time.
Jan 26 09:45:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:20.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:20 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:45:20 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e134 e134: 3 total, 3 up, 3 in
Jan 26 09:45:20 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 134 pg[9.1d( v 60'1159 (0'0,60'1159] local-lis/les=133/134 n=5 ec=63/48 lis/c=98/98 les/c/f=99/99/0 sis=133) [2]/[1] async=[2] r=0 lpr=133 pi=[98,133)/1 crt=60'1159 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:45:20 compute-1 ceph-mon[80107]: osdmap e133: 3 total, 3 up, 3 in
Jan 26 09:45:20 compute-1 ceph-mon[80107]: osdmap e134: 3 total, 3 up, 3 in
Jan 26 09:45:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:20.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:21 compute-1 ceph-mon[80107]: pgmap v76: 353 pgs: 1 unknown, 352 active+clean; 457 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s; 18 B/s, 0 objects/s recovering
Jan 26 09:45:21 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e135 e135: 3 total, 3 up, 3 in
Jan 26 09:45:21 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 135 pg[9.1d( v 60'1159 (0'0,60'1159] local-lis/les=133/134 n=5 ec=63/48 lis/c=133/98 les/c/f=134/99/0 sis=135 pruub=14.956477165s) [2] async=[2] r=-1 lpr=135 pi=[98,135)/1 crt=60'1159 mlcod 60'1159 active pruub 314.923736572s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:45:21 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 135 pg[9.1d( v 60'1159 (0'0,60'1159] local-lis/les=133/134 n=5 ec=63/48 lis/c=133/98 les/c/f=134/99/0 sis=135 pruub=14.956420898s) [2] r=-1 lpr=135 pi=[98,135)/1 crt=60'1159 mlcod 0'0 unknown NOTIFY pruub 314.923736572s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:45:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:45:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:22.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:45:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:22.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e136 e136: 3 total, 3 up, 3 in
Jan 26 09:45:22 compute-1 ceph-mon[80107]: osdmap e135: 3 total, 3 up, 3 in
Jan 26 09:45:22 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Jan 26 09:45:22 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:22.946405) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 09:45:22 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Jan 26 09:45:22 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420722946448, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 3213, "num_deletes": 251, "total_data_size": 9968564, "memory_usage": 10224976, "flush_reason": "Manual Compaction"}
Jan 26 09:45:22 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Jan 26 09:45:22 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420722982459, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 6284817, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7369, "largest_seqno": 10577, "table_properties": {"data_size": 6270463, "index_size": 9248, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 4037, "raw_key_size": 36445, "raw_average_key_size": 22, "raw_value_size": 6239048, "raw_average_value_size": 3914, "num_data_blocks": 402, "num_entries": 1594, "num_filter_entries": 1594, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420622, "oldest_key_time": 1769420622, "file_creation_time": 1769420722, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Jan 26 09:45:22 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 36102 microseconds, and 10429 cpu microseconds.
Jan 26 09:45:22 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 09:45:22 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:22.982508) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 6284817 bytes OK
Jan 26 09:45:22 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:22.982531) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Jan 26 09:45:22 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:22.985011) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Jan 26 09:45:22 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:22.985050) EVENT_LOG_v1 {"time_micros": 1769420722985045, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 09:45:22 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:22.985077) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 09:45:22 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 9952950, prev total WAL file size 9952950, number of live WAL files 2.
Jan 26 09:45:22 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:45:22 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:22.987358) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Jan 26 09:45:22 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 09:45:22 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(6137KB)], [18(10MB)]
Jan 26 09:45:22 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420722987441, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 17228625, "oldest_snapshot_seqno": -1}
Jan 26 09:45:23 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4056 keys, 14807031 bytes, temperature: kUnknown
Jan 26 09:45:23 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420723076571, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 14807031, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14774547, "index_size": 21238, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10181, "raw_key_size": 103546, "raw_average_key_size": 25, "raw_value_size": 14694884, "raw_average_value_size": 3622, "num_data_blocks": 914, "num_entries": 4056, "num_filter_entries": 4056, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769420722, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Jan 26 09:45:23 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 09:45:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:23.076844) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 14807031 bytes
Jan 26 09:45:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:23.089511) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 193.1 rd, 166.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(6.0, 10.4 +0.0 blob) out(14.1 +0.0 blob), read-write-amplify(5.1) write-amplify(2.4) OK, records in: 4594, records dropped: 538 output_compression: NoCompression
Jan 26 09:45:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:23.089555) EVENT_LOG_v1 {"time_micros": 1769420723089537, "job": 8, "event": "compaction_finished", "compaction_time_micros": 89207, "compaction_time_cpu_micros": 32678, "output_level": 6, "num_output_files": 1, "total_output_size": 14807031, "num_input_records": 4594, "num_output_records": 4056, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 09:45:23 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:45:23 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420723091379, "job": 8, "event": "table_file_deletion", "file_number": 20}
Jan 26 09:45:23 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:45:23 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420723094079, "job": 8, "event": "table_file_deletion", "file_number": 18}
Jan 26 09:45:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:22.987182) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:45:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:23.094198) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:45:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:23.094205) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:45:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:23.094207) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:45:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:23.094209) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:45:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:23.094211) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:45:23 compute-1 ceph-mon[80107]: pgmap v79: 353 pgs: 1 unknown, 352 active+clean; 457 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 26 09:45:23 compute-1 ceph-mon[80107]: osdmap e136: 3 total, 3 up, 3 in
Jan 26 09:45:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:24.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094524 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:45:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:24.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:25 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:45:25 compute-1 ceph-mon[80107]: pgmap v81: 353 pgs: 1 unknown, 352 active+clean; 457 KiB data, 153 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:45:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:26.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:26.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:27 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Jan 26 09:45:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e137 e137: 3 total, 3 up, 3 in
Jan 26 09:45:27 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 137 pg[9.1e( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=5 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=137 pruub=13.013389587s) [0] r=-1 lpr=137 pi=[79,137)/1 crt=60'1159 mlcod 0'0 active pruub 318.115966797s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:45:27 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 137 pg[9.1e( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=5 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=137 pruub=13.013346672s) [0] r=-1 lpr=137 pi=[79,137)/1 crt=60'1159 mlcod 0'0 unknown NOTIFY pruub 318.115966797s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:45:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e138 e138: 3 total, 3 up, 3 in
Jan 26 09:45:27 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 138 pg[9.1e( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=5 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=138) [0]/[1] r=0 lpr=138 pi=[79,138)/1 crt=60'1159 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:45:27 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 138 pg[9.1e( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=5 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=138) [0]/[1] r=0 lpr=138 pi=[79,138)/1 crt=60'1159 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 26 09:45:28 compute-1 ceph-mon[80107]: pgmap v82: 353 pgs: 353 active+clean; 457 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s; 36 B/s, 0 objects/s recovering
Jan 26 09:45:28 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Jan 26 09:45:28 compute-1 ceph-mon[80107]: osdmap e137: 3 total, 3 up, 3 in
Jan 26 09:45:28 compute-1 ceph-mon[80107]: osdmap e138: 3 total, 3 up, 3 in
Jan 26 09:45:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:28.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:28 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e139 e139: 3 total, 3 up, 3 in
Jan 26 09:45:28 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 139 pg[9.1f( v 60'1159 (0'0,60'1159] local-lis/les=101/102 n=5 ec=63/48 lis/c=101/101 les/c/f=102/102/0 sis=139 pruub=12.346158981s) [0] r=-1 lpr=139 pi=[101,139)/1 crt=60'1159 mlcod 0'0 active pruub 319.068176270s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:45:28 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 139 pg[9.1f( v 60'1159 (0'0,60'1159] local-lis/les=101/102 n=5 ec=63/48 lis/c=101/101 les/c/f=102/102/0 sis=139 pruub=12.346119881s) [0] r=-1 lpr=139 pi=[101,139)/1 crt=60'1159 mlcod 0'0 unknown NOTIFY pruub 319.068176270s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:45:28 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 139 pg[9.1e( v 60'1159 (0'0,60'1159] local-lis/les=138/139 n=5 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=138) [0]/[1] async=[0] r=0 lpr=138 pi=[79,138)/1 crt=60'1159 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:45:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:28.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:29 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 09:45:29 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 09:45:29 compute-1 ceph-mon[80107]: osdmap e139: 3 total, 3 up, 3 in
Jan 26 09:45:29 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e140 e140: 3 total, 3 up, 3 in
Jan 26 09:45:29 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 140 pg[9.1f( v 60'1159 (0'0,60'1159] local-lis/les=101/102 n=5 ec=63/48 lis/c=101/101 les/c/f=102/102/0 sis=140) [0]/[1] r=0 lpr=140 pi=[101,140)/1 crt=60'1159 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:45:29 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 140 pg[9.1e( v 60'1159 (0'0,60'1159] local-lis/les=138/139 n=5 ec=63/48 lis/c=138/79 les/c/f=139/80/0 sis=140 pruub=15.001749039s) [0] async=[0] r=-1 lpr=140 pi=[79,140)/1 crt=60'1159 mlcod 60'1159 active pruub 322.724273682s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:45:29 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 140 pg[9.1f( v 60'1159 (0'0,60'1159] local-lis/les=101/102 n=5 ec=63/48 lis/c=101/101 les/c/f=102/102/0 sis=140) [0]/[1] r=0 lpr=140 pi=[101,140)/1 crt=60'1159 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 26 09:45:29 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 140 pg[9.1e( v 60'1159 (0'0,60'1159] local-lis/les=138/139 n=5 ec=63/48 lis/c=138/79 les/c/f=139/80/0 sis=140 pruub=15.001703262s) [0] r=-1 lpr=140 pi=[79,140)/1 crt=60'1159 mlcod 0'0 unknown NOTIFY pruub 322.724273682s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:45:30 compute-1 ceph-mon[80107]: pgmap v85: 353 pgs: 353 active+clean; 457 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s; 36 B/s, 0 objects/s recovering
Jan 26 09:45:30 compute-1 ceph-mon[80107]: osdmap e140: 3 total, 3 up, 3 in
Jan 26 09:45:30 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 1.
Jan 26 09:45:30 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:45:30 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.483s CPU time.
Jan 26 09:45:30 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 09:45:30 compute-1 podman[91485]: 2026-01-26 09:45:30.394046351 +0000 UTC m=+0.021084965 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:45:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:45:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:30.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:45:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:45:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:30.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:45:31 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:45:31 compute-1 podman[91485]: 2026-01-26 09:45:31.192942454 +0000 UTC m=+0.819981038 container create f32869ac6ab743e2b48608e16b6d5e4d055ccee739772d437fabcd66b742e07f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Jan 26 09:45:31 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e141 e141: 3 total, 3 up, 3 in
Jan 26 09:45:31 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 141 pg[9.1f( v 60'1159 (0'0,60'1159] local-lis/les=140/141 n=5 ec=63/48 lis/c=101/101 les/c/f=102/102/0 sis=140) [0]/[1] async=[0] r=0 lpr=140 pi=[101,140)/1 crt=60'1159 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 09:45:31 compute-1 ceph-mon[80107]: pgmap v88: 353 pgs: 1 unknown, 1 active+remapped, 351 active+clean; 457 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:45:31 compute-1 ceph-mon[80107]: osdmap e141: 3 total, 3 up, 3 in
Jan 26 09:45:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/acaa23a82ff129fe08b8ca585e5004a757ed19363666aee393d8ce5d14943f83/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 09:45:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/acaa23a82ff129fe08b8ca585e5004a757ed19363666aee393d8ce5d14943f83/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 09:45:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/acaa23a82ff129fe08b8ca585e5004a757ed19363666aee393d8ce5d14943f83/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:45:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/acaa23a82ff129fe08b8ca585e5004a757ed19363666aee393d8ce5d14943f83/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 09:45:31 compute-1 podman[91485]: 2026-01-26 09:45:31.396696531 +0000 UTC m=+1.023735145 container init f32869ac6ab743e2b48608e16b6d5e4d055ccee739772d437fabcd66b742e07f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Jan 26 09:45:31 compute-1 podman[91485]: 2026-01-26 09:45:31.401690311 +0000 UTC m=+1.028728905 container start f32869ac6ab743e2b48608e16b6d5e4d055ccee739772d437fabcd66b742e07f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 09:45:31 compute-1 bash[91485]: f32869ac6ab743e2b48608e16b6d5e4d055ccee739772d437fabcd66b742e07f
Jan 26 09:45:31 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:45:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:31 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 09:45:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:31 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 09:45:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:31 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 09:45:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:31 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 09:45:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:31 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 09:45:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:31 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 09:45:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:31 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 09:45:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:31 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:45:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e142 e142: 3 total, 3 up, 3 in
Jan 26 09:45:32 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 142 pg[9.1f( v 60'1159 (0'0,60'1159] local-lis/les=140/141 n=5 ec=63/48 lis/c=140/101 les/c/f=141/102/0 sis=142 pruub=14.966416359s) [0] async=[0] r=-1 lpr=142 pi=[101,142)/1 crt=60'1159 mlcod 60'1159 active pruub 325.436614990s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 09:45:32 compute-1 ceph-osd[77632]: osd.1 pg_epoch: 142 pg[9.1f( v 60'1159 (0'0,60'1159] local-lis/les=140/141 n=5 ec=63/48 lis/c=140/101 les/c/f=141/102/0 sis=142 pruub=14.966349602s) [0] r=-1 lpr=142 pi=[101,142)/1 crt=60'1159 mlcod 0'0 unknown NOTIFY pruub 325.436614990s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 09:45:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:32.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:32.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:33 compute-1 ceph-mon[80107]: osdmap e142: 3 total, 3 up, 3 in
Jan 26 09:45:33 compute-1 ceph-mon[80107]: pgmap v91: 353 pgs: 1 unknown, 1 active+remapped, 351 active+clean; 457 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:45:33 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 e143: 3 total, 3 up, 3 in
Jan 26 09:45:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:34.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:45:34 compute-1 ceph-mon[80107]: osdmap e143: 3 total, 3 up, 3 in
Jan 26 09:45:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:45:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:34.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:45:34 compute-1 sudo[91544]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:45:34 compute-1 sudo[91544]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:45:34 compute-1 sudo[91544]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:35 compute-1 ceph-mon[80107]: pgmap v93: 353 pgs: 1 unknown, 1 active+remapped, 351 active+clean; 457 KiB data, 153 MiB used, 60 GiB / 60 GiB avail
Jan 26 09:45:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:45:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:36.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:45:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:36.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:45:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:37 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:45:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:37 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:45:37 compute-1 ceph-mon[80107]: pgmap v94: 353 pgs: 353 active+clean; 457 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 170 B/s wr, 1 op/s; 36 B/s, 1 objects/s recovering
Jan 26 09:45:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:38.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:45:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:38.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:45:39 compute-1 ceph-mon[80107]: pgmap v95: 353 pgs: 353 active+clean; 457 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 830 B/s rd, 138 B/s wr, 1 op/s; 29 B/s, 1 objects/s recovering
Jan 26 09:45:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:40.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:45:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:40.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:45:41 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:45:41 compute-1 ceph-mon[80107]: pgmap v96: 353 pgs: 353 active+clean; 457 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.3 KiB/s rd, 1.3 KiB/s wr, 4 op/s; 26 B/s, 1 objects/s recovering
Jan 26 09:45:42 compute-1 sshd-session[91573]: Connection closed by authenticating user root 178.62.249.31 port 42990 [preauth]
Jan 26 09:45:42 compute-1 sshd-session[91576]: Connection closed by authenticating user root 152.42.136.110 port 57954 [preauth]
Jan 26 09:45:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:45:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:42.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:45:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:42.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:43 compute-1 sudo[91196]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 09:45:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 09:45:43 compute-1 sudo[91740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbywhorkrrlthdvtnmlbddxoyjynulgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420743.589738-365-205942072461173/AnsiballZ_command.py'
Jan 26 09:45:43 compute-1 sudo[91740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:45:43 compute-1 ceph-mon[80107]: pgmap v97: 353 pgs: 353 active+clean; 457 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1.1 KiB/s wr, 3 op/s; 21 B/s, 0 objects/s recovering
Jan 26 09:45:44 compute-1 python3.9[91742]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:45:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:44.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:44 compute-1 sudo[91740]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:44 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba8000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:44 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:44 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:44.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:45 compute-1 sudo[92032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohahzvhtmymxhdbqbmnotjhisyehmtqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420745.1330638-389-53820845218469/AnsiballZ_selinux.py'
Jan 26 09:45:45 compute-1 sudo[92032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:45:46 compute-1 ceph-mon[80107]: pgmap v98: 353 pgs: 353 active+clean; 457 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1.0 KiB/s wr, 3 op/s; 20 B/s, 0 objects/s recovering
Jan 26 09:45:46 compute-1 python3.9[92034]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 26 09:45:46 compute-1 sudo[92032]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:46 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:45:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:45:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:46.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:45:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:46 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:46 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094546 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:45:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:46 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:46 compute-1 sudo[92184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftivhzfzqwdzermsfjbigsbsdxfdilxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420746.5711436-422-56072685368336/AnsiballZ_command.py'
Jan 26 09:45:46 compute-1 sudo[92184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:45:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:46.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:47 compute-1 python3.9[92186]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 26 09:45:47 compute-1 sudo[92184]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:47 compute-1 sudo[92337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzwxoztapnybotdjbvhhlvenyxwapmta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420747.330573-446-79664491877451/AnsiballZ_file.py'
Jan 26 09:45:47 compute-1 sudo[92337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:45:47 compute-1 python3.9[92339]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:45:47 compute-1 sudo[92337]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:48 compute-1 ceph-mon[80107]: pgmap v99: 353 pgs: 353 active+clean; 457 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s; 18 B/s, 0 objects/s recovering
Jan 26 09:45:48 compute-1 sudo[92489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnnhlcqanurlxdrxjtjkdgyspbpykzep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420747.9934394-470-187522108466443/AnsiballZ_mount.py'
Jan 26 09:45:48 compute-1 sudo[92489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:45:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:45:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:48.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:45:48 compute-1 python3.9[92491]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 26 09:45:48 compute-1 sudo[92489]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:48 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b900016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:48 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b840016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:48 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:48.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:45:49 compute-1 sudo[92642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bovqhyaytpfnbstdijxlvkiwtcjvnozo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420749.6712644-554-96008959505197/AnsiballZ_file.py'
Jan 26 09:45:49 compute-1 sudo[92642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:45:50 compute-1 ceph-mon[80107]: pgmap v100: 353 pgs: 353 active+clean; 457 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:45:50 compute-1 python3.9[92644]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:45:50 compute-1 sudo[92642]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:50.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:50 compute-1 sudo[92794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iibixlndpivvwzncoiziyrkyasntrexh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420750.3973987-579-46146923524820/AnsiballZ_stat.py'
Jan 26 09:45:50 compute-1 sudo[92794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:45:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:50 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:50 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:50 compute-1 python3.9[92796]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:45:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:50 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b840016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:50 compute-1 sudo[92794]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:45:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:50.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:45:51 compute-1 sudo[92872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skayyxyssvrqbbdxgdljvickipcmluty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420750.3973987-579-46146923524820/AnsiballZ_file.py'
Jan 26 09:45:51 compute-1 sudo[92872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:45:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:45:51 compute-1 sudo[92875]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:45:51 compute-1 sudo[92875]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:45:51 compute-1 sudo[92875]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:51 compute-1 sudo[92900]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 09:45:51 compute-1 sudo[92900]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:45:51 compute-1 python3.9[92874]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:45:51 compute-1 sudo[92872]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:51 compute-1 sudo[92900]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:52 compute-1 ceph-mon[80107]: pgmap v101: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 1.2 KiB/s wr, 3 op/s
Jan 26 09:45:52 compute-1 ceph-mon[80107]: mgrmap e32: compute-0.zllcia(active, since 92s), standbys: compute-1.xammti, compute-2.oynaeu
Jan 26 09:45:52 compute-1 sudo[93108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxiejukwfcdkanhpzoymsuifnwuyoygr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420752.2313912-641-54190592845135/AnsiballZ_stat.py'
Jan 26 09:45:52 compute-1 sudo[93108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:45:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:45:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:52.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:45:52 compute-1 python3.9[93110]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:45:52 compute-1 sudo[93108]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:52 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c0023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:52 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:52 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:52.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:53 compute-1 ceph-mon[80107]: pgmap v102: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 341 B/s wr, 0 op/s
Jan 26 09:45:53 compute-1 sudo[93263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdhvycskzdkdczfwvmfliygbyedvpyko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420753.381877-680-102915575758318/AnsiballZ_getent.py'
Jan 26 09:45:53 compute-1 sudo[93263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:45:54 compute-1 python3.9[93265]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 26 09:45:54 compute-1 sudo[93263]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:54.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:54 compute-1 sudo[93416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtybodizwuernzdtuyanpawkidloczlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420754.4158795-710-76759969035897/AnsiballZ_getent.py'
Jan 26 09:45:54 compute-1 sudo[93416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:45:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:54 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b840016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:54 compute-1 python3.9[93418]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 26 09:45:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:54 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c0023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:54 compute-1 sudo[93416]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:54 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:54.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:55 compute-1 sudo[93444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:45:55 compute-1 sudo[93444]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:45:55 compute-1 sudo[93444]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:55 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:45:55 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:45:55 compute-1 ceph-mon[80107]: pgmap v103: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 341 B/s wr, 0 op/s
Jan 26 09:45:55 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:45:55 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 09:45:55 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:45:55 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:45:55 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 09:45:55 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 09:45:55 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:45:55 compute-1 sudo[93595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sseynltbuktyzjddlvvqhlkxksxevogs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420755.130421-734-43495250967484/AnsiballZ_group.py'
Jan 26 09:45:55 compute-1 sudo[93595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:45:56 compute-1 python3.9[93597]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 09:45:56 compute-1 sudo[93595]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:45:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:45:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:56.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:45:56 compute-1 sudo[93749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prxcdamupnxvrbfikzfdsbxunibhjslr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420756.3772662-761-220902295781616/AnsiballZ_file.py'
Jan 26 09:45:56 compute-1 sudo[93749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:45:56 compute-1 python3.9[93751]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 26 09:45:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:56 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:56 compute-1 sudo[93749]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:56 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:56 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c0023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:56.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:57 compute-1 sshd-session[93622]: Invalid user admin from 104.248.198.71 port 34674
Jan 26 09:45:57 compute-1 sshd-session[93622]: Connection closed by invalid user admin 104.248.198.71 port 34674 [preauth]
Jan 26 09:45:57 compute-1 sudo[93902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmcimshalgaafkolxwkfnxouoerixsae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420757.3798788-794-261362666085092/AnsiballZ_dnf.py'
Jan 26 09:45:57 compute-1 sudo[93902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:45:57 compute-1 python3.9[93904]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:45:58 compute-1 ceph-mon[80107]: pgmap v104: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 341 B/s wr, 0 op/s
Jan 26 09:45:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:58.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:58 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:58 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:58 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:45:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:45:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:45:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:58.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:45:59 compute-1 ceph-mon[80107]: pgmap v105: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 255 B/s wr, 0 op/s
Jan 26 09:45:59 compute-1 sudo[93902]: pam_unix(sudo:session): session closed for user root
Jan 26 09:45:59 compute-1 sudo[94056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvxmdubmsrgxqhsvkwiioblcpqggnjui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420759.66134-818-230975466803691/AnsiballZ_file.py'
Jan 26 09:45:59 compute-1 sudo[94056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:00 compute-1 python3.9[94058]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:46:00 compute-1 sudo[94056]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:00 compute-1 sudo[94106]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 09:46:00 compute-1 sudo[94106]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:46:00 compute-1 sudo[94106]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:00.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:00 compute-1 sudo[94233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzucdbmsuwvkvdpnbcimlhebkeqkvolg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420760.4524548-842-15197753659588/AnsiballZ_stat.py'
Jan 26 09:46:00 compute-1 sudo[94233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:00 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:00 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b900032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:00 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:00 compute-1 python3.9[94235]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:46:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:46:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:00.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:46:00 compute-1 sudo[94233]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:01 compute-1 sudo[94311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzpzkgnmgeijzrlhzrhwauxqdjklsepz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420760.4524548-842-15197753659588/AnsiballZ_file.py'
Jan 26 09:46:01 compute-1 sudo[94311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:46:01 compute-1 python3.9[94313]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:46:01 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:46:01 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:46:01 compute-1 ceph-mon[80107]: pgmap v106: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 255 B/s wr, 0 op/s
Jan 26 09:46:01 compute-1 sudo[94311]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:02 compute-1 sudo[94464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acijuvpnqmtuxmurdznglflrhpawzcig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420761.78501-881-72792485439837/AnsiballZ_stat.py'
Jan 26 09:46:02 compute-1 sudo[94464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:02 compute-1 python3.9[94466]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:46:02 compute-1 sudo[94464]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:02 compute-1 sudo[94542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmvgicfggcmixojowmhgayndxseaeyce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420761.78501-881-72792485439837/AnsiballZ_file.py'
Jan 26 09:46:02 compute-1 sudo[94542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:02.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:02 compute-1 python3.9[94544]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:46:02 compute-1 sudo[94542]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:02 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:02 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:02 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b900032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:02.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:03 compute-1 ceph-mon[80107]: pgmap v107: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:46:03 compute-1 sudo[94695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddtorgfhizsjwqowhxteizyojoyigkzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420763.529034-926-134032730115991/AnsiballZ_dnf.py'
Jan 26 09:46:03 compute-1 sudo[94695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:04 compute-1 python3.9[94697]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:46:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:04.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:46:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:04 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:04 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:04 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:46:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:04.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:46:05 compute-1 sudo[94695]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:05 compute-1 ceph-mon[80107]: pgmap v108: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:46:06 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:46:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094606 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:46:06 compute-1 python3.9[94849]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:46:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:06.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:06 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:06 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:06 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:06.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:07 compute-1 python3.9[95001]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 26 09:46:07 compute-1 ceph-mon[80107]: pgmap v109: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 09:46:07 compute-1 python3.9[95152]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:46:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:08.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:08 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:08 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:08 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:08.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:09 compute-1 sudo[95302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdpznbtmsopwmfhqdtchoacmfhnwruvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420768.50512-1049-13764220704255/AnsiballZ_systemd.py'
Jan 26 09:46:09 compute-1 sudo[95302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:09 compute-1 python3.9[95304]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:46:09 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 26 09:46:09 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Jan 26 09:46:09 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 26 09:46:09 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 26 09:46:09 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 26 09:46:09 compute-1 sudo[95302]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:09 compute-1 ceph-mon[80107]: pgmap v110: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:46:10 compute-1 python3.9[95466]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 26 09:46:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:10.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:10 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:10 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:10 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:46:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:10.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:46:11 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:46:11 compute-1 ceph-mon[80107]: pgmap v111: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:46:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:12.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:12 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:12 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:12 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:12.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:14 compute-1 sudo[95618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imsvfticramkstsjbzvvqeaxrhbjajqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420773.74736-1220-98162431351224/AnsiballZ_systemd.py'
Jan 26 09:46:14 compute-1 sudo[95618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:14 compute-1 ceph-mon[80107]: pgmap v112: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:46:14 compute-1 python3.9[95620]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:46:14 compute-1 sudo[95618]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:46:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:14.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:46:14 compute-1 sudo[95772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qywtnbrhpxztlsqdfvpdwdafwvjfcytq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420774.471657-1220-212829316379462/AnsiballZ_systemd.py'
Jan 26 09:46:14 compute-1 sudo[95772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:14 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:14 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:14 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:46:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:14.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:46:15 compute-1 python3.9[95774]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:46:15 compute-1 sudo[95772]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:15 compute-1 sudo[95777]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:46:15 compute-1 sudo[95777]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:46:15 compute-1 sudo[95777]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:15 : epoch 697737bb : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:46:15 compute-1 ceph-mon[80107]: pgmap v113: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:46:15 compute-1 sshd-session[89222]: Connection closed by 192.168.122.30 port 40942
Jan 26 09:46:15 compute-1 sshd-session[89219]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:46:15 compute-1 systemd[1]: session-38.scope: Deactivated successfully.
Jan 26 09:46:15 compute-1 systemd[1]: session-38.scope: Consumed 1min 3.016s CPU time.
Jan 26 09:46:15 compute-1 systemd-logind[783]: Session 38 logged out. Waiting for processes to exit.
Jan 26 09:46:15 compute-1 systemd-logind[783]: Removed session 38.
Jan 26 09:46:16 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:46:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:16.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:16 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:16 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:16 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:16.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:17 compute-1 ceph-mon[80107]: pgmap v114: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:46:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:18 : epoch 697737bb : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:46:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:18 : epoch 697737bb : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:46:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 09:46:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:18.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 09:46:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:46:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:18 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:18 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:18 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:18.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:19 compute-1 ceph-mon[80107]: pgmap v115: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:46:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:20.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:20 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:20 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:20 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:20.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:21 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:46:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:21 : epoch 697737bb : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:46:21 compute-1 sshd-session[95833]: Accepted publickey for zuul from 192.168.122.30 port 60266 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:46:21 compute-1 systemd-logind[783]: New session 39 of user zuul.
Jan 26 09:46:21 compute-1 systemd[1]: Started Session 39 of User zuul.
Jan 26 09:46:21 compute-1 sshd-session[95833]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:46:21 compute-1 ceph-mon[80107]: pgmap v116: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:46:22 compute-1 python3.9[95987]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:46:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 09:46:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:22.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 09:46:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:22 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:22 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:22 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:22.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:23 compute-1 sudo[96142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abvhhsheilhsnekurmzpzmlkinqsbyux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420783.2167926-64-50858499716452/AnsiballZ_getent.py'
Jan 26 09:46:23 compute-1 sudo[96142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:23 compute-1 python3.9[96144]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 26 09:46:23 compute-1 sudo[96142]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:23 compute-1 ceph-mon[80107]: pgmap v117: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:46:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:24.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:24 compute-1 sudo[96297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brgckhzrsicqvizomedobpyuzkdzrwlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420784.4528012-100-257760870108277/AnsiballZ_setup.py'
Jan 26 09:46:24 compute-1 sudo[96297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:24 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:24 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:24 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b800016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 09:46:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:24.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 09:46:25 compute-1 python3.9[96299]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 09:46:25 compute-1 sudo[96297]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:25 compute-1 sudo[96382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byvwxzxctvuptgvkyltykwrfntvbdmbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420784.4528012-100-257760870108277/AnsiballZ_dnf.py'
Jan 26 09:46:25 compute-1 sudo[96382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:25 compute-1 python3.9[96384]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 09:46:25 compute-1 ceph-mon[80107]: pgmap v118: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:46:26 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:46:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094626 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:46:26 compute-1 sshd-session[96199]: Connection closed by authenticating user root 152.42.136.110 port 58854 [preauth]
Jan 26 09:46:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:26.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:26 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:26 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:26 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:26.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:27 compute-1 sudo[96382]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:27 compute-1 sudo[96536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klbivijttfzsprcblupvhrfifrlqrqfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420787.5502968-142-38077612744904/AnsiballZ_dnf.py'
Jan 26 09:46:27 compute-1 sudo[96536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:27 compute-1 ceph-mon[80107]: pgmap v119: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:46:28 compute-1 python3.9[96538]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:46:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:28.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:28 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b800016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:28 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:28 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:28.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:29 compute-1 sudo[96536]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:29 compute-1 ceph-mon[80107]: pgmap v120: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 26 09:46:30 compute-1 sshd-session[96540]: Connection closed by authenticating user root 178.62.249.31 port 43802 [preauth]
Jan 26 09:46:30 compute-1 sudo[96692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htvuimbbxsdjyfjcwvpprcxplvuglroi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420789.7703428-166-21179296642818/AnsiballZ_systemd.py'
Jan 26 09:46:30 compute-1 sudo[96692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 09:46:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:30.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 09:46:30 compute-1 python3.9[96694]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 09:46:30 compute-1 sudo[96692]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:30 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003f70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:30 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b800016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:30 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:30.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:31 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:46:31 compute-1 python3.9[96848]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:46:31 compute-1 ceph-mon[80107]: pgmap v121: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 26 09:46:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 09:46:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:32.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 09:46:32 compute-1 sudo[96998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cquhcpoqqcsbmdijulbxvknbhdwxcigr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420792.0930877-220-219823446418604/AnsiballZ_sefcontext.py'
Jan 26 09:46:32 compute-1 sudo[96998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:32 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:32 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003f90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:32 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:32.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:33 compute-1 python3.9[97000]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 26 09:46:33 compute-1 sudo[96998]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:33 compute-1 ceph-mon[80107]: pgmap v122: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:46:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:46:34 compute-1 python3.9[97151]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:46:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 09:46:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:34.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 09:46:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:34 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:34 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0036e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:34 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:34.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:35 compute-1 sudo[97307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvtrmlmgocpxjqivwpnrvdvalmgfkzzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420794.822402-274-44005466597666/AnsiballZ_dnf.py'
Jan 26 09:46:35 compute-1 sudo[97307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:35 compute-1 sudo[97310]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:46:35 compute-1 sudo[97310]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:46:35 compute-1 sudo[97310]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:35 compute-1 python3.9[97309]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:46:35 compute-1 ceph-mon[80107]: pgmap v123: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:46:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:46:36 compute-1 sudo[97307]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:36.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:36 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:36 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:36 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0036e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:36.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:37 compute-1 sudo[97487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pudqxpirpmsjmcigffoksrvzzvxoelvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420797.0443587-298-196163368428218/AnsiballZ_command.py'
Jan 26 09:46:37 compute-1 sudo[97487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:37 compute-1 python3.9[97489]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:46:38 compute-1 ceph-mon[80107]: pgmap v124: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:46:38 compute-1 sudo[97487]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:38.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:38 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:38 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:38 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:38.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:39 compute-1 sudo[97774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxfgkqjxnqhtgbxjrmuzwwrurbshhbxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420798.6736317-322-277939566810386/AnsiballZ_file.py'
Jan 26 09:46:39 compute-1 sudo[97774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:39 compute-1 python3.9[97776]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 26 09:46:39 compute-1 sudo[97774]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:40 compute-1 ceph-mon[80107]: pgmap v125: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:46:40 compute-1 python3.9[97927]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:46:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 09:46:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:40.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 09:46:40 compute-1 sudo[98079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyflyjwzaqptgwparsgngofuhklxgtss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420800.4813259-370-21144748835325/AnsiballZ_dnf.py'
Jan 26 09:46:40 compute-1 sudo[98079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:40 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0036e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:40 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:40 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:40.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:41 compute-1 python3.9[98081]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:46:41 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:46:42 compute-1 ceph-mon[80107]: pgmap v126: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:46:42 compute-1 sudo[98079]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 09:46:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:42.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 09:46:42 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:42 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:42 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:42 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0036e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:42 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:42 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:42.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:43 compute-1 ceph-mon[80107]: pgmap v127: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:46:44 compute-1 sudo[98234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-latjngbpnwwsvvdukrsaqsndkgybrhtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420803.8740158-397-173587323639811/AnsiballZ_dnf.py'
Jan 26 09:46:44 compute-1 sudo[98234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:44 compute-1 python3.9[98236]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:46:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:44.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:44 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:44 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:44 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:44.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:45 compute-1 ceph-mon[80107]: pgmap v128: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:46:45 compute-1 sudo[98234]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:46 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:46:46 compute-1 sudo[98388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njtpxsufecmdmepkwpliwdywlmunhchn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420806.27761-433-110292162669933/AnsiballZ_stat.py'
Jan 26 09:46:46 compute-1 sudo[98388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:46.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:46 compute-1 python3.9[98390]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:46:46 compute-1 sudo[98388]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:46 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:46 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84004010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:46 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:46.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:47 compute-1 sudo[98542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwtonxonhskqxfernqtlsbtloxwxlore ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420807.0287273-457-191075679677612/AnsiballZ_slurp.py'
Jan 26 09:46:47 compute-1 sudo[98542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:46:47 compute-1 python3.9[98545]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Jan 26 09:46:47 compute-1 sudo[98542]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:47 compute-1 ceph-mon[80107]: pgmap v129: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 26 09:46:48 compute-1 sshd-session[98546]: Invalid user admin from 104.248.198.71 port 50138
Jan 26 09:46:48 compute-1 sshd-session[98546]: Connection closed by invalid user admin 104.248.198.71 port 50138 [preauth]
Jan 26 09:46:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 09:46:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:48.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 09:46:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:46:48 compute-1 sshd-session[95836]: Connection closed by 192.168.122.30 port 60266
Jan 26 09:46:48 compute-1 sshd-session[95833]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:46:48 compute-1 systemd[1]: session-39.scope: Deactivated successfully.
Jan 26 09:46:48 compute-1 systemd[1]: session-39.scope: Consumed 18.227s CPU time.
Jan 26 09:46:48 compute-1 systemd-logind[783]: Session 39 logged out. Waiting for processes to exit.
Jan 26 09:46:48 compute-1 systemd-logind[783]: Removed session 39.
Jan 26 09:46:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:48 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0036e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:48 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:48 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84004030 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:48.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:49 compute-1 ceph-mon[80107]: pgmap v130: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:46:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 09:46:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:50.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 09:46:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:50 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:50 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:50 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:50.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:46:51 compute-1 ceph-mon[80107]: pgmap v131: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:46:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 09:46:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:52.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 09:46:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:52 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:52 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:52 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 09:46:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:52.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 09:46:53 compute-1 sshd-session[98577]: Accepted publickey for zuul from 192.168.122.30 port 46592 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:46:53 compute-1 systemd-logind[783]: New session 40 of user zuul.
Jan 26 09:46:53 compute-1 systemd[1]: Started Session 40 of User zuul.
Jan 26 09:46:53 compute-1 sshd-session[98577]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:46:53 compute-1 ceph-mon[80107]: pgmap v132: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:46:54 compute-1 python3.9[98731]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:46:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:54.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:54 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:54 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba8000f90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:54 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 09:46:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:54.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 09:46:55 compute-1 sudo[98886]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:46:55 compute-1 sudo[98886]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:46:55 compute-1 sudo[98886]: pam_unix(sudo:session): session closed for user root
Jan 26 09:46:55 compute-1 python3.9[98885]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 09:46:55 compute-1 ceph-mon[80107]: pgmap v133: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:46:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:46:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:56.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:56 compute-1 python3.9[99104]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:46:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:56 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b840041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:56 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba8000f90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:56 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:57.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:57 compute-1 sshd-session[98580]: Connection closed by 192.168.122.30 port 46592
Jan 26 09:46:57 compute-1 sshd-session[98577]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:46:57 compute-1 systemd[1]: session-40.scope: Deactivated successfully.
Jan 26 09:46:57 compute-1 systemd[1]: session-40.scope: Consumed 2.288s CPU time.
Jan 26 09:46:57 compute-1 systemd-logind[783]: Session 40 logged out. Waiting for processes to exit.
Jan 26 09:46:57 compute-1 systemd-logind[783]: Removed session 40.
Jan 26 09:46:57 compute-1 ceph-mon[80107]: pgmap v134: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 26 09:46:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:46:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:58.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:46:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:58 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:58 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b840041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:58 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba8000f90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:46:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:46:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 09:46:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:59.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 09:46:59 compute-1 ceph-mon[80107]: pgmap v135: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:47:00 compute-1 sudo[99132]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:47:00 compute-1 sudo[99132]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:47:00 compute-1 sudo[99132]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:00.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:00 compute-1 sudo[99157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 09:47:00 compute-1 sudo[99157]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:47:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:00 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:00 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:00 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:01.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:47:01 compute-1 sudo[99157]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:01 compute-1 ceph-mon[80107]: pgmap v136: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:47:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:02.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:02 compute-1 sshd-session[99215]: Accepted publickey for zuul from 192.168.122.30 port 33942 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:47:02 compute-1 systemd-logind[783]: New session 41 of user zuul.
Jan 26 09:47:02 compute-1 systemd[1]: Started Session 41 of User zuul.
Jan 26 09:47:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:02 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:02 compute-1 sshd-session[99215]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:47:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:02 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b840041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:02 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b840041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:03.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:03 compute-1 sshd-session[71446]: Received disconnect from 38.102.83.222 port 48076:11: disconnected by user
Jan 26 09:47:03 compute-1 sshd-session[71446]: Disconnected from user zuul 38.102.83.222 port 48076
Jan 26 09:47:03 compute-1 sshd-session[71443]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:47:03 compute-1 systemd[1]: session-19.scope: Deactivated successfully.
Jan 26 09:47:03 compute-1 systemd[1]: session-19.scope: Consumed 9.012s CPU time.
Jan 26 09:47:03 compute-1 systemd-logind[783]: Session 19 logged out. Waiting for processes to exit.
Jan 26 09:47:03 compute-1 systemd-logind[783]: Removed session 19.
Jan 26 09:47:03 compute-1 ceph-mon[80107]: pgmap v137: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:47:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:47:03 compute-1 python3.9[99369]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:47:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 09:47:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:04.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 09:47:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:04 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:04 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:04 compute-1 python3.9[99523]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:47:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:04 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:05.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:05 compute-1 sudo[99678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quagozsfesxlznapsveyaxacegyzybvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420825.3186784-76-67842584355533/AnsiballZ_setup.py'
Jan 26 09:47:05 compute-1 sudo[99678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:05 compute-1 python3.9[99680]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 09:47:05 compute-1 ceph-mon[80107]: pgmap v138: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:47:05 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:47:05 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:47:05 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:47:05 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 09:47:05 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:47:05 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:47:05 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 09:47:05 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 09:47:05 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:47:06 compute-1 sudo[99678]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:06 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:47:06 compute-1 sudo[99762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtbgmjbjkyyyqrkbxxnxrailylrapndf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420825.3186784-76-67842584355533/AnsiballZ_dnf.py'
Jan 26 09:47:06 compute-1 sudo[99762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 09:47:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:06.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 09:47:06 compute-1 python3.9[99764]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:47:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:06 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba8008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:06 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:06 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:07.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:07 compute-1 sudo[99762]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:07 compute-1 ceph-mon[80107]: pgmap v139: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 26 09:47:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 09:47:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:08.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 09:47:08 compute-1 sudo[99918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guckqijckfnejinzkpciipspnnsaiplb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420828.4148872-112-20421299405019/AnsiballZ_setup.py'
Jan 26 09:47:08 compute-1 sudo[99918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:08 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:08 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba80096e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:08 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:09.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:09 compute-1 python3.9[99920]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 09:47:09 compute-1 sudo[99918]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:09 compute-1 sshd-session[99791]: Connection closed by authenticating user root 152.42.136.110 port 37304 [preauth]
Jan 26 09:47:10 compute-1 ceph-mon[80107]: pgmap v140: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:47:10 compute-1 sudo[100114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvtxulvfpwqbztexdeqqbgixizarfnlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420829.7262979-145-248072983192780/AnsiballZ_file.py'
Jan 26 09:47:10 compute-1 sudo[100114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:10 compute-1 python3.9[100116]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:47:10 compute-1 sudo[100114]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:10 compute-1 sudo[100153]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 09:47:10 compute-1 sudo[100153]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:47:10 compute-1 sudo[100153]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:10.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:10 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:10 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:10 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba80096e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:11.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:11 compute-1 sudo[100291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwidqrhbcgiuoanyavaojqabxuahpsuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420830.587056-169-151928916812896/AnsiballZ_command.py'
Jan 26 09:47:11 compute-1 sudo[100291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:11 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:47:11 compute-1 python3.9[100293]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:47:11 compute-1 sudo[100291]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:11 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:47:11 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:47:11 compute-1 ceph-mon[80107]: pgmap v141: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:47:12 compute-1 sudo[100457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moznnycshxhribapztdzmaoasflzgmxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420831.5481124-193-6943452980205/AnsiballZ_stat.py'
Jan 26 09:47:12 compute-1 sudo[100457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:12 compute-1 python3.9[100459]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:47:12 compute-1 sudo[100457]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:12 compute-1 sudo[100535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzivttumtiacxpmovikkogiovoxvwtnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420831.5481124-193-6943452980205/AnsiballZ_file.py'
Jan 26 09:47:12 compute-1 sudo[100535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:12 compute-1 python3.9[100537]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:47:12 compute-1 sudo[100535]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:12.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:12 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:12 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:12 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:13.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:13 compute-1 sudo[100687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edkkppwexqlxwikhznfhgwsnkhvbzlbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420832.9173493-229-182726585654644/AnsiballZ_stat.py'
Jan 26 09:47:13 compute-1 sudo[100687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:13 compute-1 python3.9[100689]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:47:13 compute-1 sudo[100687]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:13 compute-1 sudo[100766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfyovauyexccmdwchicpbxgeyytzvmro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420832.9173493-229-182726585654644/AnsiballZ_file.py'
Jan 26 09:47:13 compute-1 sudo[100766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:13 compute-1 ceph-mon[80107]: pgmap v142: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:47:13 compute-1 python3.9[100768]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:47:13 compute-1 sudo[100766]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094714 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:47:14 compute-1 sudo[100918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blhtrauxeiubjvamgtstircrzdaclfwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420834.1349287-268-52877163691457/AnsiballZ_ini_file.py'
Jan 26 09:47:14 compute-1 sudo[100918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:14 compute-1 python3.9[100920]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:47:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:14.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:14 compute-1 sudo[100918]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:14 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:14 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b840046d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:14 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:47:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:15.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:47:15 compute-1 sudo[101070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgswrfnpkddadehifmhzwnwmqftjqzwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420834.8778913-268-219476520362450/AnsiballZ_ini_file.py'
Jan 26 09:47:15 compute-1 sudo[101070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:15 compute-1 python3.9[101072]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:47:15 compute-1 sudo[101070]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:15 compute-1 sudo[101073]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:47:15 compute-1 sudo[101073]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:47:15 compute-1 sudo[101073]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:15 compute-1 ceph-mon[80107]: pgmap v143: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:47:15 compute-1 sudo[101248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alzlhapgajoolncijqxlmkzjpimxxfpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420835.4956045-268-110000502908072/AnsiballZ_ini_file.py'
Jan 26 09:47:15 compute-1 sudo[101248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:15 compute-1 python3.9[101250]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:47:15 compute-1 sudo[101248]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:16 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:47:16 compute-1 sudo[101400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wutterdohvnfhlsgxzbbzejategjjzll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420836.080587-268-147650952292125/AnsiballZ_ini_file.py'
Jan 26 09:47:16 compute-1 sudo[101400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:16 compute-1 python3.9[101402]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:47:16 compute-1 sudo[101400]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:16.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:16 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba80096e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:16 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:16 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b840046f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:17.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:17 compute-1 sudo[101555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imsfxsxzkzpgdqoxrhozdxgbshyqiwbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420837.2385325-361-158251141776860/AnsiballZ_dnf.py'
Jan 26 09:47:17 compute-1 sudo[101555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:17 compute-1 python3.9[101557]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:47:17 compute-1 ceph-mon[80107]: pgmap v144: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 09:47:18 compute-1 sshd-session[101427]: Connection closed by authenticating user root 178.62.249.31 port 58242 [preauth]
Jan 26 09:47:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:47:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:18.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:18 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:18 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba80096e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:18 compute-1 sudo[101555]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:18 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:19.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:19 compute-1 ceph-mon[80107]: pgmap v145: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:47:19 compute-1 sudo[101709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhhetyakivfknkmcfdhfxvpdyhojmuku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420839.661177-394-185253812672052/AnsiballZ_setup.py'
Jan 26 09:47:19 compute-1 sudo[101709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:20 compute-1 python3.9[101711]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:47:20 compute-1 sudo[101709]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:20.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:20 compute-1 sudo[101863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwwbknykbjqgdoqnaymxpscyggpdsplk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420840.5053596-418-247003290116212/AnsiballZ_stat.py'
Jan 26 09:47:20 compute-1 sudo[101863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:20 compute-1 python3.9[101865]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:47:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:20 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84004cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:20 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:20 compute-1 sudo[101863]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:20 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba800a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:21.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:21 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:47:21 compute-1 sudo[102016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzxonpmhtmndsisdxiwgoazqvmiwxlpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420841.219939-445-234319789926067/AnsiballZ_stat.py'
Jan 26 09:47:21 compute-1 sudo[102016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:21 compute-1 python3.9[102018]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:47:21 compute-1 sudo[102016]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:21 compute-1 ceph-mon[80107]: pgmap v146: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:47:22 compute-1 sudo[102169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxvhsfalytjudsrxxvxgcgyaqtrpdbbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420842.1024516-475-23793349597411/AnsiballZ_command.py'
Jan 26 09:47:22 compute-1 sudo[102169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:22 compute-1 python3.9[102171]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:47:22 compute-1 sudo[102169]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:47:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:22.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:47:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:22 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:22 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84004cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:22 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:23.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:23 : epoch 697737bb : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:47:23 compute-1 sudo[102324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmdyhfdykzfxgwenvefxalimunbeviux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420843.0923283-505-127043884062321/AnsiballZ_service_facts.py'
Jan 26 09:47:23 compute-1 sudo[102324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094723 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:47:23 compute-1 python3.9[102326]: ansible-service_facts Invoked
Jan 26 09:47:23 compute-1 ceph-mon[80107]: pgmap v147: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:47:23 compute-1 network[102343]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 09:47:23 compute-1 network[102344]: 'network-scripts' will be removed from distribution in near future.
Jan 26 09:47:23 compute-1 network[102345]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 09:47:24 compute-1 sshd-session[102019]: Invalid user admin from 176.120.22.13 port 26902
Jan 26 09:47:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:24.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:24 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba800a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:24 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:24 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c001000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:25 compute-1 sshd-session[102019]: Connection reset by invalid user admin 176.120.22.13 port 26902 [preauth]
Jan 26 09:47:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:25.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:25 compute-1 ceph-mon[80107]: pgmap v148: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:47:26 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:47:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:26 : epoch 697737bb : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:47:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:26 : epoch 697737bb : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:47:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:26.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:26 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:26 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba800a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:26 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:27.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:27 compute-1 sshd-session[102379]: Connection reset by authenticating user root 176.120.22.13 port 49168 [preauth]
Jan 26 09:47:27 compute-1 sudo[102324]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:27 compute-1 ceph-mon[80107]: pgmap v149: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 426 B/s wr, 1 op/s
Jan 26 09:47:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:28.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:28 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c001000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:28 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:28 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba800a7e0 fd 39 proxy ignored for local
Jan 26 09:47:28 compute-1 kernel: ganesha.nfsd[98575]: segfault at 50 ip 00007f4c3167c32e sp 00007f4b9effc210 error 4 in libntirpc.so.5.8[7f4c31661000+2c000] likely on CPU 6 (core 0, socket 6)
Jan 26 09:47:28 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 09:47:28 compute-1 systemd[1]: Started Process Core Dump (PID 102487/UID 0).
Jan 26 09:47:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:29.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:29 compute-1 sshd-session[102479]: Invalid user admin from 176.120.22.13 port 49180
Jan 26 09:47:29 compute-1 ceph-mon[80107]: pgmap v150: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 426 B/s wr, 1 op/s
Jan 26 09:47:29 compute-1 sshd-session[102479]: Connection reset by invalid user admin 176.120.22.13 port 49180 [preauth]
Jan 26 09:47:30 compute-1 systemd-coredump[102488]: Process 91504 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 61:
                                                    #0  0x00007f4c3167c32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    #1  0x0000000000000000 n/a (n/a + 0x0)
                                                    #2  0x00007f4c31686900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)
                                                    ELF object binary architecture: AMD x86-64
Jan 26 09:47:30 compute-1 systemd[1]: systemd-coredump@1-102487-0.service: Deactivated successfully.
Jan 26 09:47:30 compute-1 systemd[1]: systemd-coredump@1-102487-0.service: Consumed 1.084s CPU time.
Jan 26 09:47:30 compute-1 podman[102495]: 2026-01-26 09:47:30.160536133 +0000 UTC m=+0.028951176 container died f32869ac6ab743e2b48608e16b6d5e4d055ccee739772d437fabcd66b742e07f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:47:30 compute-1 systemd[1]: var-lib-containers-storage-overlay-acaa23a82ff129fe08b8ca585e5004a757ed19363666aee393d8ce5d14943f83-merged.mount: Deactivated successfully.
Jan 26 09:47:30 compute-1 podman[102495]: 2026-01-26 09:47:30.198815995 +0000 UTC m=+0.067231068 container remove f32869ac6ab743e2b48608e16b6d5e4d055ccee739772d437fabcd66b742e07f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1)
Jan 26 09:47:30 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 09:47:30 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 09:47:30 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.482s CPU time.
Jan 26 09:47:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:47:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:30.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:47:31 compute-1 sudo[102688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epsgaaqfkoogrctjwzzkwmeheifoeevl ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1769420850.6312976-550-190586515363074/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1769420850.6312976-550-190586515363074/args'
Jan 26 09:47:31 compute-1 sudo[102688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:31.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:31 compute-1 sudo[102688]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:31 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:47:31 compute-1 sudo[102856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srnlksldgwrmeucgcnrjokrkpqzceofz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420851.4489808-583-224607655866450/AnsiballZ_dnf.py'
Jan 26 09:47:31 compute-1 sudo[102856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:31 compute-1 ceph-mon[80107]: pgmap v151: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:47:32 compute-1 python3.9[102858]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:47:32 compute-1 sshd-session[102491]: Invalid user 1234 from 176.120.22.13 port 49194
Jan 26 09:47:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:32.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:33.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:33 compute-1 sshd-session[102491]: Connection reset by invalid user 1234 176.120.22.13 port 49194 [preauth]
Jan 26 09:47:33 compute-1 sudo[102856]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:33 compute-1 ceph-mon[80107]: pgmap v152: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:47:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:47:34 compute-1 sudo[103012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmjrynnsilejirxxfcnyrakmmrudizim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420853.898418-622-114738921239848/AnsiballZ_package_facts.py'
Jan 26 09:47:34 compute-1 sudo[103012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:34.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:34 compute-1 python3.9[103014]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 26 09:47:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094734 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:47:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [NOTICE] 025/094734 (4) : haproxy version is 2.3.17-d1c9119
Jan 26 09:47:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [NOTICE] 025/094734 (4) : path to executable is /usr/local/sbin/haproxy
Jan 26 09:47:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [ALERT] 025/094734 (4) : backend 'backend' has no server available!
Jan 26 09:47:35 compute-1 sudo[103012]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:35.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:35 compute-1 sudo[103040]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:47:35 compute-1 sudo[103040]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:47:35 compute-1 sudo[103040]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:35 compute-1 sshd-session[102884]: Connection reset by authenticating user root 176.120.22.13 port 44646 [preauth]
Jan 26 09:47:35 compute-1 ceph-mon[80107]: pgmap v153: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:47:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:47:36 compute-1 sudo[103190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnwewomwmknylqczphbrvfpxpqiuoenc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420855.8410797-652-29517684264168/AnsiballZ_stat.py'
Jan 26 09:47:36 compute-1 sudo[103190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:36 compute-1 python3.9[103192]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:47:36 compute-1 sudo[103190]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:36 compute-1 sudo[103268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyeugsoevrpjbftwxiduylfiscymldqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420855.8410797-652-29517684264168/AnsiballZ_file.py'
Jan 26 09:47:36 compute-1 sudo[103268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:47:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:36.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:47:36 compute-1 python3.9[103270]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:47:36 compute-1 sudo[103268]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:37.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:37 compute-1 sudo[103420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfcfphgmvhmpqydqtpwlxwuntcxamean ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420857.1511145-689-1761373796722/AnsiballZ_stat.py'
Jan 26 09:47:37 compute-1 sudo[103420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:37 compute-1 python3.9[103423]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:47:37 compute-1 sudo[103420]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:37 compute-1 ceph-mon[80107]: pgmap v154: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 852 B/s wr, 2 op/s
Jan 26 09:47:37 compute-1 sudo[103499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scbustsnxwtrluwldwydumpgqfwxwlpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420857.1511145-689-1761373796722/AnsiballZ_file.py'
Jan 26 09:47:37 compute-1 sudo[103499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:38 compute-1 python3.9[103501]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:47:38 compute-1 sudo[103499]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094738 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:47:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:47:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:38.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:47:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:39.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:39 compute-1 sshd-session[103526]: Invalid user admin from 104.248.198.71 port 33360
Jan 26 09:47:39 compute-1 sshd-session[103526]: Connection closed by invalid user admin 104.248.198.71 port 33360 [preauth]
Jan 26 09:47:39 compute-1 ceph-mon[80107]: pgmap v155: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 511 B/s wr, 1 op/s
Jan 26 09:47:40 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 2.
Jan 26 09:47:40 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:47:40 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.482s CPU time.
Jan 26 09:47:40 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 09:47:40 compute-1 sudo[103667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsxuqwuroloiywahbijtpsdastsvcjuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420860.0218847-743-143439948266258/AnsiballZ_lineinfile.py'
Jan 26 09:47:40 compute-1 sudo[103667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:40 compute-1 podman[103700]: 2026-01-26 09:47:40.593768898 +0000 UTC m=+0.042531631 container create cc086136e1c432f5ca4718ee9ab857dcab4db1a7efab79790d7b9f2c243517ef (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Jan 26 09:47:40 compute-1 systemd[84066]: Created slice User Background Tasks Slice.
Jan 26 09:47:40 compute-1 systemd[84066]: Starting Cleanup of User's Temporary Files and Directories...
Jan 26 09:47:40 compute-1 systemd[84066]: Finished Cleanup of User's Temporary Files and Directories.
Jan 26 09:47:40 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ccab391d2ddf76ae48998672269732e3b03ef5875ece04442f9e9329de58aac3/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 09:47:40 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ccab391d2ddf76ae48998672269732e3b03ef5875ece04442f9e9329de58aac3/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:47:40 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ccab391d2ddf76ae48998672269732e3b03ef5875ece04442f9e9329de58aac3/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 09:47:40 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ccab391d2ddf76ae48998672269732e3b03ef5875ece04442f9e9329de58aac3/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 09:47:40 compute-1 python3.9[103670]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:47:40 compute-1 sudo[103667]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:40 compute-1 podman[103700]: 2026-01-26 09:47:40.572060521 +0000 UTC m=+0.020823264 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:47:40 compute-1 podman[103700]: 2026-01-26 09:47:40.697642331 +0000 UTC m=+0.146405034 container init cc086136e1c432f5ca4718ee9ab857dcab4db1a7efab79790d7b9f2c243517ef (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Jan 26 09:47:40 compute-1 podman[103700]: 2026-01-26 09:47:40.702203046 +0000 UTC m=+0.150965739 container start cc086136e1c432f5ca4718ee9ab857dcab4db1a7efab79790d7b9f2c243517ef (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 09:47:40 compute-1 bash[103700]: cc086136e1c432f5ca4718ee9ab857dcab4db1a7efab79790d7b9f2c243517ef
Jan 26 09:47:40 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:47:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:40 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 09:47:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:40 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 09:47:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:40 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 09:47:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:40 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 09:47:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:40 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 09:47:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:40 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 09:47:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:40.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:40 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 09:47:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:40 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:47:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:41.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:41 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:47:41 compute-1 ceph-mon[80107]: pgmap v156: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:47:42 compute-1 sudo[103908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrnhjkbhsbjikrnzdafwhqktqxeaskug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420861.6703632-787-88345395881795/AnsiballZ_setup.py'
Jan 26 09:47:42 compute-1 sudo[103908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:42 compute-1 python3.9[103910]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 09:47:42 compute-1 sudo[103908]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:42.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:42 compute-1 sudo[103992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grnldzzvmwxurctbcgekuzkybsghhceh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420861.6703632-787-88345395881795/AnsiballZ_systemd.py'
Jan 26 09:47:42 compute-1 sudo[103992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:43.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:43 compute-1 python3.9[103994]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:47:43 compute-1 sudo[103992]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:44 compute-1 ceph-mon[80107]: pgmap v157: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:47:44 compute-1 sshd-session[99218]: Connection closed by 192.168.122.30 port 33942
Jan 26 09:47:44 compute-1 sshd-session[99215]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:47:44 compute-1 systemd[1]: session-41.scope: Deactivated successfully.
Jan 26 09:47:44 compute-1 systemd[1]: session-41.scope: Consumed 23.866s CPU time.
Jan 26 09:47:44 compute-1 systemd-logind[783]: Session 41 logged out. Waiting for processes to exit.
Jan 26 09:47:44 compute-1 systemd-logind[783]: Removed session 41.
Jan 26 09:47:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:44.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:45.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094745 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:47:46 compute-1 ceph-mon[80107]: pgmap v158: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.9 KiB/s rd, 1.6 KiB/s wr, 4 op/s
Jan 26 09:47:46 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:47:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:46.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:46 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 26 09:47:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:46 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 26 09:47:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:46 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:47:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:46 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:47:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:47:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:47.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:47:48 compute-1 ceph-mon[80107]: pgmap v159: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.9 KiB/s rd, 1.6 KiB/s wr, 4 op/s
Jan 26 09:47:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:47:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:48.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:47:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:49.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:47:49 compute-1 sshd-session[104024]: Accepted publickey for zuul from 192.168.122.30 port 48520 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:47:49 compute-1 systemd-logind[783]: New session 42 of user zuul.
Jan 26 09:47:49 compute-1 systemd[1]: Started Session 42 of User zuul.
Jan 26 09:47:49 compute-1 sshd-session[104024]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:47:50 compute-1 sudo[104178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfeccrdcpcrmemucbndilwpblzkzucgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420869.6112647-22-46838412780595/AnsiballZ_file.py'
Jan 26 09:47:50 compute-1 sudo[104178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:50 compute-1 ceph-mon[80107]: pgmap v160: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Jan 26 09:47:50 compute-1 python3.9[104180]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:47:50 compute-1 sudo[104178]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:50.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:51.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:51 compute-1 sudo[104330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mthxudivdfknuvlwphuaxwloddayacjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420870.6004133-58-915213449739/AnsiballZ_stat.py'
Jan 26 09:47:51 compute-1 sudo[104330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:47:51 compute-1 python3.9[104332]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:47:51 compute-1 sudo[104330]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:51 compute-1 sudo[104409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjtyobpqwslnpbappuuypoyecmmfsale ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420870.6004133-58-915213449739/AnsiballZ_file.py'
Jan 26 09:47:51 compute-1 sudo[104409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:51 compute-1 ceph-mon[80107]: pgmap v161: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.0 KiB/s rd, 1.4 KiB/s wr, 4 op/s
Jan 26 09:47:51 compute-1 python3.9[104411]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:47:51 compute-1 sudo[104409]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:51 compute-1 sshd-session[104412]: Connection closed by authenticating user root 152.42.136.110 port 58368 [preauth]
Jan 26 09:47:52 compute-1 sshd-session[104028]: Connection closed by 192.168.122.30 port 48520
Jan 26 09:47:52 compute-1 sshd-session[104024]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:47:52 compute-1 systemd[1]: session-42.scope: Deactivated successfully.
Jan 26 09:47:52 compute-1 systemd[1]: session-42.scope: Consumed 1.528s CPU time.
Jan 26 09:47:52 compute-1 systemd-logind[783]: Session 42 logged out. Waiting for processes to exit.
Jan 26 09:47:52 compute-1 systemd-logind[783]: Removed session 42.
Jan 26 09:47:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:52.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000009:nfs.cephfs.0: -2
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe710000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:53.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:53 compute-1 ceph-mon[80107]: pgmap v162: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 26 09:47:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:54.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:54 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:54 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe710001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094754 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:47:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:54 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:47:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:55.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:47:55 compute-1 sudo[104456]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:47:55 compute-1 sudo[104456]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:47:55 compute-1 sudo[104456]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:55 compute-1 ceph-mon[80107]: pgmap v163: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:47:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:47:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:56.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:56 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:56 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:56 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe710001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:57.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:57 compute-1 sshd-session[104481]: Accepted publickey for zuul from 192.168.122.30 port 57200 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:47:57 compute-1 systemd-logind[783]: New session 43 of user zuul.
Jan 26 09:47:57 compute-1 systemd[1]: Started Session 43 of User zuul.
Jan 26 09:47:57 compute-1 sshd-session[104481]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:47:57 compute-1 ceph-mon[80107]: pgmap v164: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:47:58 compute-1 python3.9[104635]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:47:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:47:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:58.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:47:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:58 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:58 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:58 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:47:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:47:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:47:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:59.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:47:59 compute-1 sudo[104789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcmflbfqkxkiuswldgqiagnlzhaddhsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420878.9471824-55-96832209199682/AnsiballZ_file.py'
Jan 26 09:47:59 compute-1 sudo[104789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:47:59 compute-1 python3.9[104791]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:47:59 compute-1 sudo[104789]: pam_unix(sudo:session): session closed for user root
Jan 26 09:47:59 compute-1 ceph-mon[80107]: pgmap v165: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:48:00 compute-1 sudo[104965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggnzebpdxwzranxmreslakkepiqfstux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420879.8255754-79-132621356397148/AnsiballZ_stat.py'
Jan 26 09:48:00 compute-1 sudo[104965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:00 compute-1 python3.9[104967]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:48:00 compute-1 sudo[104965]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:00.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:00 compute-1 sudo[105043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjpwppdgpvadiicojkksdxhzdpftfxfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420879.8255754-79-132621356397148/AnsiballZ_file.py'
Jan 26 09:48:00 compute-1 sudo[105043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:00 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7100089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:00 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:00 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:01.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:01 compute-1 python3.9[105045]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.49xu_gz8 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:01 compute-1 sudo[105043]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:48:01 compute-1 ceph-mon[80107]: pgmap v166: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:48:01 compute-1 sudo[105196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvonsfphefgwonbimjxikjwimbvpavjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420881.6783025-139-105323910072871/AnsiballZ_stat.py'
Jan 26 09:48:01 compute-1 sudo[105196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:02 compute-1 python3.9[105198]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:48:02 compute-1 sudo[105196]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:02 compute-1 sudo[105274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roxngzfadjtsearlzynhdclhhbmwxwnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420881.6783025-139-105323910072871/AnsiballZ_file.py'
Jan 26 09:48:02 compute-1 sudo[105274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:02 compute-1 python3.9[105276]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.k66gfg_f recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:02 compute-1 sudo[105274]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:02.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:02 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:02 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7100089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:02 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:03.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:03 compute-1 sudo[105426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmtchkxzarxgoaremjsqokjhxinlbsxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420882.9014297-178-199978697706037/AnsiballZ_file.py'
Jan 26 09:48:03 compute-1 sudo[105426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:03 compute-1 python3.9[105428]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:48:03 compute-1 sudo[105426]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094803 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:48:03 compute-1 ceph-mon[80107]: pgmap v167: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:48:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:48:03 compute-1 sudo[105579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzaluhxydirniqqcqemwlkepehjgjupq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420883.5528169-202-79134738417764/AnsiballZ_stat.py'
Jan 26 09:48:03 compute-1 sudo[105579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:03 compute-1 python3.9[105581]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:48:03 compute-1 sudo[105579]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:04 compute-1 sudo[105657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esylrytgpctylpnbulqghcloxfxeiysu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420883.5528169-202-79134738417764/AnsiballZ_file.py'
Jan 26 09:48:04 compute-1 sudo[105657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:04 compute-1 python3.9[105659]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:48:04 compute-1 sudo[105657]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:04.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:04 compute-1 sudo[105810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnuawsfcbrvptbfjrizztcsltwkbeemv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420884.557274-202-87439641029601/AnsiballZ_stat.py'
Jan 26 09:48:04 compute-1 sudo[105810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:04 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:04 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:04 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7100096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:04 compute-1 python3.9[105812]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:48:05 compute-1 sudo[105810]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:05.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:05 compute-1 sudo[105889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvjtojukkotaaeywuvfeturmnnppfaar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420884.557274-202-87439641029601/AnsiballZ_file.py'
Jan 26 09:48:05 compute-1 sudo[105889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:05 compute-1 python3.9[105891]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:48:05 compute-1 sudo[105889]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:05 compute-1 ceph-mon[80107]: pgmap v168: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:48:05 compute-1 sshd-session[105759]: Connection closed by authenticating user root 178.62.249.31 port 57034 [preauth]
Jan 26 09:48:06 compute-1 sudo[106042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwykvylhiogntjblafqnhhyqrzxhklil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420885.797423-271-243995591908080/AnsiballZ_file.py'
Jan 26 09:48:06 compute-1 sudo[106042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:06 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:48:06 compute-1 python3.9[106044]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:06 compute-1 sudo[106042]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:06.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:06 compute-1 sudo[106194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogtpztkdmzzkfyrtnpwyqsdfyzeozzwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420886.5490575-295-64919526996316/AnsiballZ_stat.py'
Jan 26 09:48:06 compute-1 sudo[106194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:06 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:06 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:06 compute-1 python3.9[106196]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:48:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:06 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7100096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:07 compute-1 sudo[106194]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:07.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:07 compute-1 sudo[106272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iczkltzjaoxhpcunlbtoupgozziipulg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420886.5490575-295-64919526996316/AnsiballZ_file.py'
Jan 26 09:48:07 compute-1 sudo[106272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:07 compute-1 python3.9[106274]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:07 compute-1 sudo[106272]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.689261) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420887689308, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1884, "num_deletes": 250, "total_data_size": 5391717, "memory_usage": 5477968, "flush_reason": "Manual Compaction"}
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420887705108, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 2272474, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10582, "largest_seqno": 12461, "table_properties": {"data_size": 2266628, "index_size": 2981, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 14457, "raw_average_key_size": 20, "raw_value_size": 2253941, "raw_average_value_size": 3147, "num_data_blocks": 132, "num_entries": 716, "num_filter_entries": 716, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420723, "oldest_key_time": 1769420723, "file_creation_time": 1769420887, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 15913 microseconds, and 5537 cpu microseconds.
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.705175) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 2272474 bytes OK
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.705204) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.706690) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.706733) EVENT_LOG_v1 {"time_micros": 1769420887706728, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.706752) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 5383367, prev total WAL file size 5383367, number of live WAL files 2.
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.708288) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323531' seq:0, type:0; will stop at (end)
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(2219KB)], [21(14MB)]
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420887708560, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 17079505, "oldest_snapshot_seqno": -1}
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4327 keys, 15332897 bytes, temperature: kUnknown
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420887805375, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 15332897, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15299746, "index_size": 21201, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10885, "raw_key_size": 109502, "raw_average_key_size": 25, "raw_value_size": 15216508, "raw_average_value_size": 3516, "num_data_blocks": 911, "num_entries": 4327, "num_filter_entries": 4327, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769420887, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.805607) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 15332897 bytes
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.809749) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 176.3 rd, 158.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 14.1 +0.0 blob) out(14.6 +0.0 blob), read-write-amplify(14.3) write-amplify(6.7) OK, records in: 4772, records dropped: 445 output_compression: NoCompression
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.809773) EVENT_LOG_v1 {"time_micros": 1769420887809764, "job": 10, "event": "compaction_finished", "compaction_time_micros": 96886, "compaction_time_cpu_micros": 29576, "output_level": 6, "num_output_files": 1, "total_output_size": 15332897, "num_input_records": 4772, "num_output_records": 4327, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420887810336, "job": 10, "event": "table_file_deletion", "file_number": 23}
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420887812867, "job": 10, "event": "table_file_deletion", "file_number": 21}
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.708171) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.812946) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.812961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.812963) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.812965) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:48:07 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.812967) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:48:07 compute-1 ceph-mon[80107]: pgmap v169: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:48:07 compute-1 sudo[106425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldsfcqktxchxupafnwufszizjfhisoif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420887.7123392-331-235584384759173/AnsiballZ_stat.py'
Jan 26 09:48:07 compute-1 sudo[106425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:08 compute-1 python3.9[106427]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:48:08 compute-1 sudo[106425]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:08 compute-1 sudo[106503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzmhgshhcpxipgrppakaerckxosgskon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420887.7123392-331-235584384759173/AnsiballZ_file.py'
Jan 26 09:48:08 compute-1 sudo[106503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:08 compute-1 python3.9[106505]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:08 compute-1 sudo[106503]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:08.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:08 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:08 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:08 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:09.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:09 compute-1 sudo[106655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjfsaxoqaadyhebwyhdimdnfnmkeakcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420888.89639-367-162724009606571/AnsiballZ_systemd.py'
Jan 26 09:48:09 compute-1 sudo[106655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:09 compute-1 python3.9[106658]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:48:09 compute-1 systemd[1]: Reloading.
Jan 26 09:48:09 compute-1 systemd-rc-local-generator[106683]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:48:09 compute-1 systemd-sysv-generator[106688]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:48:09 compute-1 ceph-mon[80107]: pgmap v170: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:48:10 compute-1 sudo[106655]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:10 compute-1 sudo[106846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sahiytpkqbiamaflgiymznarputctzvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420890.384596-391-106544516425839/AnsiballZ_stat.py'
Jan 26 09:48:10 compute-1 sudo[106846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:10 compute-1 sudo[106849]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:48:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:10.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:10 compute-1 sudo[106849]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:48:10 compute-1 sudo[106849]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:10 compute-1 sudo[106874]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 09:48:10 compute-1 sudo[106874]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:48:10 compute-1 python3.9[106848]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:48:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:10 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7100096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:10 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7100096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:10 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:10 compute-1 sudo[106846]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:48:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:11.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:48:11 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:48:11 compute-1 sudo[106991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgaroatzrjouplaovbxiaredgoitactc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420890.384596-391-106544516425839/AnsiballZ_file.py'
Jan 26 09:48:11 compute-1 sudo[106991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:11 compute-1 sudo[106874]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:11 compute-1 python3.9[106993]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:11 compute-1 sudo[106991]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:11 compute-1 sudo[107158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyyvaorkzadrxeugqsabfknvrubawjqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420891.5620775-427-246972401649669/AnsiballZ_stat.py'
Jan 26 09:48:11 compute-1 sudo[107158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:11 compute-1 ceph-mon[80107]: pgmap v171: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:48:11 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:48:11 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 09:48:11 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:48:11 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:48:11 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 09:48:11 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 09:48:11 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:48:11 compute-1 python3.9[107160]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:48:12 compute-1 sudo[107158]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:12 compute-1 sudo[107236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojyuqdtnxetojoxduecweldbpghmizer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420891.5620775-427-246972401649669/AnsiballZ_file.py'
Jan 26 09:48:12 compute-1 sudo[107236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:12 compute-1 python3.9[107238]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:12 compute-1 sudo[107236]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:12.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:12 : epoch 6977383c : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:48:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:12 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:12 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e0003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:12 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7100096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:13 compute-1 sudo[107388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owsbersbyusvlhiezmurbpczlegnwbtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420892.7153375-463-142813445818433/AnsiballZ_systemd.py'
Jan 26 09:48:13 compute-1 sudo[107388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:48:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:13.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:48:13 compute-1 python3.9[107390]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:48:13 compute-1 systemd[1]: Reloading.
Jan 26 09:48:13 compute-1 systemd-rc-local-generator[107417]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:48:13 compute-1 systemd-sysv-generator[107420]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:48:13 compute-1 systemd[1]: Starting Create netns directory...
Jan 26 09:48:13 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 09:48:13 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 09:48:13 compute-1 systemd[1]: Finished Create netns directory.
Jan 26 09:48:13 compute-1 sudo[107388]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:13 compute-1 ceph-mon[80107]: pgmap v172: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:48:14 compute-1 python3.9[107584]: ansible-ansible.builtin.service_facts Invoked
Jan 26 09:48:14 compute-1 network[107601]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 09:48:14 compute-1 network[107602]: 'network-scripts' will be removed from distribution in near future.
Jan 26 09:48:14 compute-1 network[107603]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 09:48:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:48:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:14.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:48:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:14 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:14 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:14 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e0003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:15.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:15 compute-1 sudo[107627]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:48:15 compute-1 sudo[107627]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:48:15 compute-1 sudo[107627]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:15 compute-1 ceph-mon[80107]: pgmap v173: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 597 B/s wr, 2 op/s
Jan 26 09:48:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:15 : epoch 6977383c : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:48:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:15 : epoch 6977383c : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:48:16 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:48:16 compute-1 sudo[107652]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 09:48:16 compute-1 sudo[107652]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:48:16 compute-1 sudo[107652]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:16.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:16 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe71000a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:16 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:16 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:17.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:17 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:48:17 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:48:17 compute-1 ceph-mon[80107]: pgmap v174: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 09:48:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:48:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:48:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:18.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:48:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:18 : epoch 6977383c : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:48:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:18 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e0003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:18 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe71000a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:18 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:19.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:19 compute-1 ceph-mon[80107]: pgmap v175: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 09:48:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:48:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:20.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:48:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:20 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:20 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e0003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:20 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe71000a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:21.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:21 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:48:21 compute-1 ceph-mon[80107]: pgmap v176: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:48:22 compute-1 sudo[107917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klogjrqnwggcnsfpafebtxczkkxdtmca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420902.1139476-541-203281708220171/AnsiballZ_stat.py'
Jan 26 09:48:22 compute-1 sudo[107917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:22 compute-1 python3.9[107919]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:48:22 compute-1 sudo[107917]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:22.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:22 compute-1 sudo[107995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otlhftzqhqfkqeqvcmbzwurvyletxddr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420902.1139476-541-203281708220171/AnsiballZ_file.py'
Jan 26 09:48:22 compute-1 sudo[107995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:22 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe71000a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:22 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe71000a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:22 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e0003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:23.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:23 compute-1 python3.9[107997]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:23 compute-1 sudo[107995]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:23 compute-1 ceph-mon[80107]: pgmap v177: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:48:23 compute-1 sudo[108148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzrogyypuiumzevxgfpzzojpftucpzak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420903.545681-580-88670137652673/AnsiballZ_file.py'
Jan 26 09:48:23 compute-1 sudo[108148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:24 compute-1 python3.9[108150]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:24 compute-1 sudo[108148]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:24 compute-1 sudo[108300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htacfjgfuxezqbfdkbvbaefmphsrfkbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420904.3048139-604-139921234738105/AnsiballZ_stat.py'
Jan 26 09:48:24 compute-1 sudo[108300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:24 compute-1 python3.9[108302]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:48:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:24.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:24 compute-1 sudo[108300]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:24 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:24 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:24 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe704001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:25 compute-1 sudo[108381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nltwxyfbuswncfxzjsknshsxitajdwvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420904.3048139-604-139921234738105/AnsiballZ_file.py'
Jan 26 09:48:25 compute-1 sudo[108381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:25.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:25 compute-1 python3.9[108383]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:25 compute-1 sudo[108381]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094825 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:48:25 compute-1 ceph-mon[80107]: pgmap v178: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:48:26 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:48:26 compute-1 sudo[108534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcktzyfjlpbczzrkogsbukkxuidkpoou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420905.7908294-649-236602458092325/AnsiballZ_timezone.py'
Jan 26 09:48:26 compute-1 sudo[108534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:26 compute-1 python3.9[108536]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 26 09:48:26 compute-1 systemd[1]: Starting Time & Date Service...
Jan 26 09:48:26 compute-1 systemd[1]: Started Time & Date Service.
Jan 26 09:48:26 compute-1 sudo[108534]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:26.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:26 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:26 compute-1 kernel: ganesha.nfsd[104453]: segfault at 50 ip 00007fe79115d32e sp 00007fe6f8ff8210 error 4 in libntirpc.so.5.8[7fe791142000+2c000] likely on CPU 0 (core 0, socket 0)
Jan 26 09:48:26 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 09:48:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:26 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec003c10 fd 38 proxy ignored for local
Jan 26 09:48:27 compute-1 systemd[1]: Started Process Core Dump (PID 108613/UID 0).
Jan 26 09:48:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:48:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:27.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:48:27 compute-1 sudo[108692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaswktvwkfpaornctxekfxmhxgmprnxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420906.919922-676-127182309446722/AnsiballZ_file.py'
Jan 26 09:48:27 compute-1 sudo[108692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:27 compute-1 python3.9[108694]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:27 compute-1 sudo[108692]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:27 compute-1 sudo[108845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kawmqnrbzkyrwlrdpbwglyjozgsagbrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420907.6703253-700-271630072411412/AnsiballZ_stat.py'
Jan 26 09:48:27 compute-1 sudo[108845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:27 compute-1 ceph-mon[80107]: pgmap v179: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:48:28 compute-1 python3.9[108847]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:48:28 compute-1 sudo[108845]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:28 compute-1 sudo[108923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hovwzdzgcdmzejshvibdfmgranynuorp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420907.6703253-700-271630072411412/AnsiballZ_file.py'
Jan 26 09:48:28 compute-1 sudo[108923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:28 compute-1 systemd-coredump[108618]: Process 103732 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 55:
                                                    #0  0x00007fe79115d32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 26 09:48:28 compute-1 python3.9[108925]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:28 compute-1 systemd[1]: systemd-coredump@2-108613-0.service: Deactivated successfully.
Jan 26 09:48:28 compute-1 systemd[1]: systemd-coredump@2-108613-0.service: Consumed 1.524s CPU time.
Jan 26 09:48:28 compute-1 sudo[108923]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:28 compute-1 podman[108930]: 2026-01-26 09:48:28.739252573 +0000 UTC m=+0.027130971 container died cc086136e1c432f5ca4718ee9ab857dcab4db1a7efab79790d7b9f2c243517ef (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Jan 26 09:48:28 compute-1 systemd[1]: var-lib-containers-storage-overlay-ccab391d2ddf76ae48998672269732e3b03ef5875ece04442f9e9329de58aac3-merged.mount: Deactivated successfully.
Jan 26 09:48:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:28.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:28 compute-1 podman[108930]: 2026-01-26 09:48:28.853150325 +0000 UTC m=+0.141028723 container remove cc086136e1c432f5ca4718ee9ab857dcab4db1a7efab79790d7b9f2c243517ef (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1)
Jan 26 09:48:28 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 09:48:29 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 09:48:29 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.763s CPU time.
Jan 26 09:48:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:29.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:29 compute-1 sudo[109123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnnjcfcbzgyxbmxcttdxhonxjfpclvpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420908.8712304-736-243653481759282/AnsiballZ_stat.py'
Jan 26 09:48:29 compute-1 sudo[109123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:29 compute-1 python3.9[109125]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:48:29 compute-1 sudo[109123]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:29 compute-1 sudo[109202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwiqwujxzpdqlasbbzccrsldkxoblreq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420908.8712304-736-243653481759282/AnsiballZ_file.py'
Jan 26 09:48:29 compute-1 sudo[109202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:29 compute-1 python3.9[109204]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.abg2wr3h recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:29 compute-1 sudo[109202]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:30 compute-1 ceph-mon[80107]: pgmap v180: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:48:30 compute-1 sudo[109354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofumwpxnynegfgcoauwwexzzlnrpzeqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420910.0577855-772-20045693464592/AnsiballZ_stat.py'
Jan 26 09:48:30 compute-1 sudo[109354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:30 compute-1 python3.9[109356]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:48:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:30.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:30 compute-1 sudo[109354]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:31.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:31 compute-1 sudo[109432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdkhvmwyksbskrgnjcviyejrsbsiqsev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420910.0577855-772-20045693464592/AnsiballZ_file.py'
Jan 26 09:48:31 compute-1 sudo[109432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:31 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:48:31 compute-1 python3.9[109434]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:31 compute-1 sudo[109432]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:32 compute-1 sudo[109585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqdrpakdqkcbhdnhfebygpqdwvtejosa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420911.6064038-811-195141882272727/AnsiballZ_command.py'
Jan 26 09:48:32 compute-1 sudo[109585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:32 compute-1 ceph-mon[80107]: pgmap v181: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:48:32 compute-1 python3.9[109587]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:48:32 compute-1 sudo[109585]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:32 compute-1 sshd-session[109588]: Invalid user admin from 104.248.198.71 port 40490
Jan 26 09:48:32 compute-1 sshd-session[109588]: Connection closed by invalid user admin 104.248.198.71 port 40490 [preauth]
Jan 26 09:48:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:32.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:32 compute-1 sudo[109740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxctbybgaknplybrueikjeakszjbvmgn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769420912.5174146-835-133999523798935/AnsiballZ_edpm_nftables_from_files.py'
Jan 26 09:48:32 compute-1 sudo[109740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094832 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:48:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:33.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:33 compute-1 python3[109742]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 09:48:33 compute-1 sudo[109740]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:33 compute-1 ceph-mon[80107]: pgmap v182: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:48:33 compute-1 sudo[109893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kskcxacdfagabbxjbuqsyiwqpyhibvpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420913.4025474-859-177251587961620/AnsiballZ_stat.py'
Jan 26 09:48:33 compute-1 sudo[109893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:33 compute-1 python3.9[109895]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:48:34 compute-1 sudo[109893]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:48:34 compute-1 sudo[109971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poybmpzhxueaifjluexrfjmsvlblknhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420913.4025474-859-177251587961620/AnsiballZ_file.py'
Jan 26 09:48:34 compute-1 sudo[109971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:34 compute-1 python3.9[109973]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:34 compute-1 sudo[109971]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:34.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:34 compute-1 sudo[110123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcmgyygsxpthrhpxgavtnyrfjlzayzry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420914.6683211-895-169080307139033/AnsiballZ_stat.py'
Jan 26 09:48:34 compute-1 sudo[110123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:35 compute-1 python3.9[110125]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:48:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:48:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:35.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:48:35 compute-1 sudo[110123]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:35 compute-1 ceph-mon[80107]: pgmap v183: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:48:35 compute-1 sudo[110249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfuhnfrdqijqdyonrqvluzyejzxmceul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420914.6683211-895-169080307139033/AnsiballZ_copy.py'
Jan 26 09:48:35 compute-1 sudo[110249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:35 compute-1 sudo[110252]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:48:35 compute-1 sudo[110252]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:48:35 compute-1 sudo[110252]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:35 compute-1 python3.9[110251]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420914.6683211-895-169080307139033/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:35 compute-1 sudo[110249]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:48:36 compute-1 sudo[110428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obckhnbmwnhowyfkblailtsoqyujbrga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420916.0765884-940-115346501464419/AnsiballZ_stat.py'
Jan 26 09:48:36 compute-1 sudo[110428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:36 compute-1 python3.9[110430]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:48:36 compute-1 sudo[110428]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:48:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:36.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:48:36 compute-1 sudo[110506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxrbrwtjwpeswlovydcsdkwsmdynbxsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420916.0765884-940-115346501464419/AnsiballZ_file.py'
Jan 26 09:48:36 compute-1 sudo[110506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:48:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:37.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:48:37 compute-1 sshd-session[110277]: Connection closed by authenticating user root 152.42.136.110 port 45418 [preauth]
Jan 26 09:48:37 compute-1 python3.9[110508]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:37 compute-1 sudo[110506]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:37 compute-1 sudo[110659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvwgbwwyxiovyzmqgqenqdynndjpijol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420917.364688-976-116712415978361/AnsiballZ_stat.py'
Jan 26 09:48:37 compute-1 sudo[110659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:37 compute-1 ceph-mon[80107]: pgmap v184: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:48:37 compute-1 python3.9[110661]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:48:37 compute-1 sudo[110659]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:38 compute-1 sudo[110737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zszmnqmnxmaehdicvhaodxtbhzfqgavj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420917.364688-976-116712415978361/AnsiballZ_file.py'
Jan 26 09:48:38 compute-1 sudo[110737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:38 compute-1 python3.9[110739]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:38 compute-1 sudo[110737]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:48:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:38.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:48:38 compute-1 sudo[110889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-labmvlqhwhejglmcrtvixbwdftwmycgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420918.5870295-1012-97227623292151/AnsiballZ_stat.py'
Jan 26 09:48:38 compute-1 sudo[110889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:39 compute-1 python3.9[110891]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:48:39 compute-1 sudo[110889]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:39.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:39 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 3.
Jan 26 09:48:39 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:48:39 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.763s CPU time.
Jan 26 09:48:39 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 09:48:39 compute-1 sudo[110998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhrjavrhnznkocebqqbwfmgcviykhlmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420918.5870295-1012-97227623292151/AnsiballZ_file.py'
Jan 26 09:48:39 compute-1 sudo[110998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:39 compute-1 podman[111017]: 2026-01-26 09:48:39.405756298 +0000 UTC m=+0.038692540 container create 1a7be39caead16f4589c4b559c326459ed0a4d717b062f3a71c6fb265aca0cb8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 09:48:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b935da6dc4aeecbe481a98d1071edd21957b8052ba6272d61816dfce610d2b23/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 09:48:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b935da6dc4aeecbe481a98d1071edd21957b8052ba6272d61816dfce610d2b23/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:48:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b935da6dc4aeecbe481a98d1071edd21957b8052ba6272d61816dfce610d2b23/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 09:48:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b935da6dc4aeecbe481a98d1071edd21957b8052ba6272d61816dfce610d2b23/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 09:48:39 compute-1 podman[111017]: 2026-01-26 09:48:39.469197153 +0000 UTC m=+0.102133415 container init 1a7be39caead16f4589c4b559c326459ed0a4d717b062f3a71c6fb265aca0cb8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Jan 26 09:48:39 compute-1 podman[111017]: 2026-01-26 09:48:39.473909824 +0000 UTC m=+0.106846066 container start 1a7be39caead16f4589c4b559c326459ed0a4d717b062f3a71c6fb265aca0cb8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True)
Jan 26 09:48:39 compute-1 bash[111017]: 1a7be39caead16f4589c4b559c326459ed0a4d717b062f3a71c6fb265aca0cb8
Jan 26 09:48:39 compute-1 podman[111017]: 2026-01-26 09:48:39.387255377 +0000 UTC m=+0.020191639 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:48:39 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:48:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 09:48:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 09:48:39 compute-1 python3.9[111006]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 09:48:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 09:48:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 09:48:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 09:48:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 09:48:39 compute-1 sudo[110998]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:48:39 compute-1 ceph-mon[80107]: pgmap v185: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:48:40 compute-1 sudo[111224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqpnshgcbgrgobrxxzzkiquhhcgiwnbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420920.0094454-1051-17629928843974/AnsiballZ_command.py'
Jan 26 09:48:40 compute-1 sudo[111224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:40 compute-1 python3.9[111226]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:48:40 compute-1 sudo[111224]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:40.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:41.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:41 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:48:41 compute-1 sudo[111379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxjuunlggqnogbzjfefgqqjkgqdasviu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420920.7020166-1075-254515407497353/AnsiballZ_blockinfile.py'
Jan 26 09:48:41 compute-1 sudo[111379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:41 compute-1 python3.9[111381]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:41 compute-1 sudo[111379]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:41 compute-1 ceph-mon[80107]: pgmap v186: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:48:42 compute-1 sudo[111532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrcmyhzbgqdychsrzcucgybfiaxdilpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420921.8597775-1102-97816126064615/AnsiballZ_file.py'
Jan 26 09:48:42 compute-1 sudo[111532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:42 compute-1 python3.9[111534]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:42 compute-1 sudo[111532]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:42 compute-1 sudo[111684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czittmqggazwrbqykwravmitpdlbtoei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420922.453202-1102-9415513730950/AnsiballZ_file.py'
Jan 26 09:48:42 compute-1 sudo[111684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:42.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:43.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:43 compute-1 ceph-mon[80107]: pgmap v187: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:48:43 compute-1 python3.9[111686]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:43 compute-1 sudo[111684]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:44 compute-1 sudo[111837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dexjgjdyibhtjzjugzhshuunyrungxzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420923.6638782-1147-637209409833/AnsiballZ_mount.py'
Jan 26 09:48:44 compute-1 sudo[111837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:44 compute-1 python3.9[111839]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 26 09:48:44 compute-1 sudo[111837]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:44 compute-1 sudo[111989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojkvxvmaeeskgqsjhcxfgtsbspmxkwdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420924.4351535-1147-271516883056968/AnsiballZ_mount.py'
Jan 26 09:48:44 compute-1 sudo[111989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:48:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:44.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:48:44 compute-1 python3.9[111991]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 26 09:48:44 compute-1 sudo[111989]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:45.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:45 compute-1 sshd-session[104485]: Connection closed by 192.168.122.30 port 57200
Jan 26 09:48:45 compute-1 sshd-session[104481]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:48:45 compute-1 systemd[1]: session-43.scope: Deactivated successfully.
Jan 26 09:48:45 compute-1 systemd[1]: session-43.scope: Consumed 29.700s CPU time.
Jan 26 09:48:45 compute-1 systemd-logind[783]: Session 43 logged out. Waiting for processes to exit.
Jan 26 09:48:45 compute-1 systemd-logind[783]: Removed session 43.
Jan 26 09:48:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:45 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:48:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:45 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:48:45 compute-1 ceph-mon[80107]: pgmap v188: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 09:48:46 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:48:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:46.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:47.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:47 compute-1 ceph-mon[80107]: pgmap v189: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 09:48:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:48:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:48.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:49.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:49 compute-1 ceph-mon[80107]: pgmap v190: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 09:48:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:48:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:50.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:48:51 compute-1 sshd-session[112019]: Accepted publickey for zuul from 192.168.122.30 port 43278 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:48:51 compute-1 systemd-logind[783]: New session 44 of user zuul.
Jan 26 09:48:51 compute-1 systemd[1]: Started Session 44 of User zuul.
Jan 26 09:48:51 compute-1 sshd-session[112019]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:48:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:48:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:51.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:48:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 09:48:51 compute-1 sudo[112180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwlcipirbjtqrmobhpwmbwxsnjkbzjpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420931.1294544-19-174433542273292/AnsiballZ_tempfile.py'
Jan 26 09:48:51 compute-1 sudo[112180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094851 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:48:51 compute-1 ceph-mon[80107]: pgmap v191: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Jan 26 09:48:51 compute-1 python3.9[112187]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 26 09:48:51 compute-1 sudo[112180]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:52 compute-1 sudo[112337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avlihpvoamqphhjdinmpoffisgzgudjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420932.2869818-55-72981112762222/AnsiballZ_stat.py'
Jan 26 09:48:52 compute-1 sudo[112337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:52.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:52 compute-1 python3.9[112339]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:48:52 compute-1 sudo[112337]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:52 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f578c000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:53 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57780016e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:53 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:48:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:53.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:48:53 compute-1 sudo[112497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lckkszsukbysmdnfgpkvnoteprablaoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420933.153352-79-100718433779324/AnsiballZ_slurp.py'
Jan 26 09:48:53 compute-1 sudo[112497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:53 compute-1 ceph-mon[80107]: pgmap v192: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 852 B/s wr, 2 op/s
Jan 26 09:48:53 compute-1 python3.9[112499]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Jan 26 09:48:53 compute-1 sudo[112497]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:54 compute-1 sudo[112649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wytpxvqowtfydwdvdawsqkkpoqnwrbbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420934.1628942-103-51088690058875/AnsiballZ_stat.py'
Jan 26 09:48:54 compute-1 sudo[112649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:54 compute-1 sshd-session[112393]: Connection closed by authenticating user root 178.62.249.31 port 38512 [preauth]
Jan 26 09:48:54 compute-1 python3.9[112651]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.or619ee0 follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:48:54 compute-1 sudo[112649]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:48:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:54.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:48:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:54 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5784001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094855 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:48:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:55 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5790001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:55 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:55.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:55 compute-1 sudo[112774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdjpnsitcvoskqxrpxydzltrfvwdtree ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420934.1628942-103-51088690058875/AnsiballZ_copy.py'
Jan 26 09:48:55 compute-1 sudo[112774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:55 compute-1 python3.9[112776]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.or619ee0 mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420934.1628942-103-51088690058875/.source.or619ee0 _original_basename=.9b015ofb follow=False checksum=e638a8a1231bcbc6594aeda119d676a260ed9e9f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:55 compute-1 sudo[112774]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:55 compute-1 ceph-mon[80107]: pgmap v193: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 852 B/s wr, 2 op/s
Jan 26 09:48:55 compute-1 sudo[112854]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:48:55 compute-1 sudo[112854]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:48:55 compute-1 sudo[112854]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:56 compute-1 sudo[112952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrtrybqcfwusglpdjvnjsixtfdfhwcle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420935.6318042-148-82777823216186/AnsiballZ_setup.py'
Jan 26 09:48:56 compute-1 sudo[112952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:48:56 compute-1 python3.9[112954]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:48:56 compute-1 sudo[112952]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:56 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 26 09:48:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:48:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:56.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:48:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:57 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57700016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:57 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57700016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:57 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57900023e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:48:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:57.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:48:57 compute-1 sudo[113106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmajotyfhxrphchcsdhrxoaockaqsjik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420936.814063-173-262171745691488/AnsiballZ_blockinfile.py'
Jan 26 09:48:57 compute-1 sudo[113106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:57 compute-1 python3.9[113108]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC0TZpcPGqQPKNdLKsJSWd1uRV3wOVDiIo3gYwVWAuH5m+Wvpw34ZI+6+d4y3DWMqDRZVWAVV0NNFB+b4MQeivx4S7KMCvBctzJ6VIyUDL5NZrwys0sYPH+33ncdZd6C8LrfCvIct+DbWCx72RQ+G0yRbYK1r/m5+dzW2411NqWn8kJkBUeLJIqT2vhFoNpO8NaWSVlWEgl5YunYEPS4v5NSM88ke6Gzc5X5sjxsz65REj6/1BXsA+quwcTAe/KC1/1Rr2cufefwf0uayM6sGuUDATjWIw36YqUeL9wc/IDdIEFEvj2hr/v+r6laaKMidOYJXBiQwIWpgWCOosSj4vrPQmDfqjOa8sAn7yWPVgxyARccavEO89zV2lpFcYTdqegPxjB90lD3Q1pMU6veJUWTRo0LAZ6n9rsRBgF0Mhr75T32Lbqf3KBro6/nPrp1XCD08mNv2cEYwp+put7vwvHzN1nPztqMsIDAMJMupwI+Buyr3xCPHe3hcAavahF+YM=
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINbUUMKlV4hksqDn2YVVAHPCHip80h7zj0rReM94Ja2l
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFtD30BOt1BlR6BYm8DU7sxF5fAzZ/aciKetiRsXWlbsXS3Z4mVG1ZAF9AhArV+OaapsLeaQFybIC0e2fudJfos=
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCyi0WEBS9Gc5Xay4vqFSdv0cJGdtezg+CrNF/vjEeF3l4EhpAAj7XRLEhEU1kz0DDKkzclG65hBNPO4/9cfzEa31EsSmzOqjqZp5ri20HVDkiZlUTTklhrbJGydUw6mcy+rIN1qsUugVHwkA9ufZLvzm9wvljzL+WPt1o41GT42NdNzyfPfnqf7HMDziNUNUUZjqsoy+DQnlMl3c3NHiGysPJ6IssbLBCFzPdBHpEYmR8b44qlJEhx3RYWl3QLcXAyoK7VpPdFO4ltMT+0KVVbLO9IUrocCQ4HfafPn/mV1Rq3phDWvCTRfRo07Mu4Oc4XBu+RIk9tt1WTIdT/ZusPUNSkFgprdU9zFIHLR0KyIX4qRSuWBeB20Ic5pvkRvNtwLB8lPt4NVi7bmun6moO8nu6cOjJ61CCAobDSEL/Z2cG3ADucjCSKtWLM0eSdt6T71NmULMhdB8ljIK4em/NCf/qZWjYr70WKyIZ9b8N5lDO8NF1tbPJyu+O0ebq/JN8=
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAaib//yQ1QyvWijjfui4OBtTtMt7Dos+hlx8rucs2Tn
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP7YXsQWyEQWSdy5tcEAtltn11CwuaqW/S8S3OB1580hTlcLZWLPDHbzSwNDf13HBG9wgLFgmueLB8U6J7wvvcM=
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDm+Vrn31pimz+Of4pkRaSS+qazCMrOF2INZ0EZsyoNG5922K2xwdC9F6r4k2L54HPEpDiazPoDsOHQvs1I+CvayNM2D+8hZhvqxZOMimP8b056aM14nht9ADrJUnlaDs57FkgIKQdxma9I0sW8Up3bbLchFOj2grOjH7gRdUBxblzIS01/P5NV8/kPsRXDoCgx+QAxU2nEqyCQd0JXLKoy+v6t+pG7We9wFXXr2z4XmAx7yeU0Y6NsJ1Seies0apLTmfK3HAtj/3LObvZegqVGDFtl5spotTmJdPJUCZhniaUmyYZ4jtIEno86Bf8OhS3NvLsxmNXuJcInlmCHGXDP9FPBrxG+yVB63FUAeyejCXntEyOzXFp8fiCuOVQuqDTWB4UxTRYh3EqVruxhY1taarew/VfsxIAxv6BWsqtvh/6xtRtJ9vTSDHsDTRaOcChfT5BnATFJ+Ilwpve8C4bjRVdlStH+99TgtNPOg2Fxf8scyIHInM9c4Yn7g8YTiyk=
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICrJdFptF1rp2hjeKcc0nSEhHvDtAYFU4gfqZN6U+WTb
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNa2lKVjuYCljd0rl1qDkTP3ZoTV9fkbcXvtxSizwygrF6dU+RWdeB3LOkT5U/2GTJuWvOqxJBc3Y1d0b3Dj5Do=
                                              create=True mode=0644 path=/tmp/ansible.or619ee0 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:57 compute-1 sudo[113106]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:57 compute-1 ceph-mon[80107]: pgmap v194: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.851291) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420937851370, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 742, "num_deletes": 251, "total_data_size": 1444235, "memory_usage": 1472544, "flush_reason": "Manual Compaction"}
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420937861433, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 953382, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12466, "largest_seqno": 13203, "table_properties": {"data_size": 949852, "index_size": 1374, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7899, "raw_average_key_size": 18, "raw_value_size": 942764, "raw_average_value_size": 2239, "num_data_blocks": 61, "num_entries": 421, "num_filter_entries": 421, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420887, "oldest_key_time": 1769420887, "file_creation_time": 1769420937, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 10212 microseconds, and 5596 cpu microseconds.
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.861506) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 953382 bytes OK
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.861528) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.862789) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.862803) EVENT_LOG_v1 {"time_micros": 1769420937862800, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.862816) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 1440326, prev total WAL file size 1440326, number of live WAL files 2.
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.863330) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(931KB)], [24(14MB)]
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420937863404, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 16286279, "oldest_snapshot_seqno": -1}
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4232 keys, 12878908 bytes, temperature: kUnknown
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420937931401, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 12878908, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12847661, "index_size": 19553, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10629, "raw_key_size": 108391, "raw_average_key_size": 25, "raw_value_size": 12767355, "raw_average_value_size": 3016, "num_data_blocks": 828, "num_entries": 4232, "num_filter_entries": 4232, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769420937, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.931901) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 12878908 bytes
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.955570) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 238.3 rd, 188.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 14.6 +0.0 blob) out(12.3 +0.0 blob), read-write-amplify(30.6) write-amplify(13.5) OK, records in: 4748, records dropped: 516 output_compression: NoCompression
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.955616) EVENT_LOG_v1 {"time_micros": 1769420937955597, "job": 12, "event": "compaction_finished", "compaction_time_micros": 68345, "compaction_time_cpu_micros": 27868, "output_level": 6, "num_output_files": 1, "total_output_size": 12878908, "num_input_records": 4748, "num_output_records": 4232, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420937955907, "job": 12, "event": "table_file_deletion", "file_number": 26}
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420937958645, "job": 12, "event": "table_file_deletion", "file_number": 24}
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.863254) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.958751) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.958760) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.958763) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.958766) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:48:57 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.958770) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:48:58 compute-1 sudo[113259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bddjxafoswupvhqzpouqgurlufdghnfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420937.6938267-197-209627092067914/AnsiballZ_command.py'
Jan 26 09:48:58 compute-1 sudo[113259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:58 compute-1 python3.9[113261]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.or619ee0' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:48:58 compute-1 sudo[113259]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:58.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:58 compute-1 sudo[113413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyqlvmhtdaerpkzwgdufwlvszhdumsac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420938.539466-221-110340674564563/AnsiballZ_file.py'
Jan 26 09:48:58 compute-1 sudo[113413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:48:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:59 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:59 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57700016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:59 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5784002520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:48:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:48:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:48:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:59.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:48:59 compute-1 python3.9[113415]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.or619ee0 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:48:59 compute-1 sudo[113413]: pam_unix(sudo:session): session closed for user root
Jan 26 09:48:59 compute-1 sshd-session[112022]: Connection closed by 192.168.122.30 port 43278
Jan 26 09:48:59 compute-1 sshd-session[112019]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:48:59 compute-1 systemd[1]: session-44.scope: Deactivated successfully.
Jan 26 09:48:59 compute-1 systemd[1]: session-44.scope: Consumed 4.991s CPU time.
Jan 26 09:48:59 compute-1 systemd-logind[783]: Session 44 logged out. Waiting for processes to exit.
Jan 26 09:48:59 compute-1 systemd-logind[783]: Removed session 44.
Jan 26 09:48:59 compute-1 ceph-mon[80107]: pgmap v195: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Jan 26 09:49:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:00 : epoch 69773877 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:49:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:00.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:01 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57900023e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:01 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:01 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57700016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:01.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:49:01 compute-1 ceph-mon[80107]: pgmap v196: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:49:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:02.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:03 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57700016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:03 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5784002e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:03 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57700016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:03.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:03 : epoch 69773877 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:49:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:03 : epoch 69773877 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:49:03 compute-1 ceph-mon[80107]: pgmap v197: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:49:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:49:04 compute-1 sshd-session[113446]: Accepted publickey for zuul from 192.168.122.30 port 40994 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:49:04 compute-1 systemd-logind[783]: New session 45 of user zuul.
Jan 26 09:49:04 compute-1 systemd[1]: Started Session 45 of User zuul.
Jan 26 09:49:04 compute-1 sshd-session[113446]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:49:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:04.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:05 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5768000e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:05 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:05 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:05.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:05 compute-1 python3.9[113599]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:49:06 compute-1 ceph-mon[80107]: pgmap v198: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:49:06 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:49:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:06 : epoch 69773877 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:49:06 compute-1 sudo[113754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odbsditkgrqwhlbwgejmoqhtlfoxfvzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420946.2018661-52-197784414118678/AnsiballZ_systemd.py'
Jan 26 09:49:06 compute-1 sudo[113754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:06.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:07 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:07 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:07 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5768001920 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:07 compute-1 python3.9[113756]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 26 09:49:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:07.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:07 compute-1 sudo[113754]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:07 compute-1 sudo[113909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhvpirtlvktsqttcxpabvtqummitdxmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420947.4600506-76-45089025544501/AnsiballZ_systemd.py'
Jan 26 09:49:07 compute-1 sudo[113909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:08 compute-1 python3.9[113911]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 09:49:08 compute-1 sudo[113909]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:08 compute-1 ceph-mon[80107]: pgmap v199: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 2 op/s
Jan 26 09:49:08 compute-1 sudo[114062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdzywljusgjgkarqzemmdvsttbrxmggo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420948.4581516-103-66775438437537/AnsiballZ_command.py'
Jan 26 09:49:08 compute-1 sudo[114062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:08.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:09 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:09 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5760000d00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:09 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c001cc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:09 compute-1 python3.9[114064]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:49:09 compute-1 sudo[114062]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:09.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:09 compute-1 ceph-mon[80107]: pgmap v200: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 2 op/s
Jan 26 09:49:09 compute-1 sudo[114217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbpluohyuwveatkduammwewstbiqnnzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420949.427105-127-58941264172657/AnsiballZ_stat.py'
Jan 26 09:49:09 compute-1 sudo[114217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:10 compute-1 python3.9[114219]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:49:10 compute-1 sudo[114217]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:10 compute-1 sudo[114369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwtgbzrnrlqxacpdzhqthyugnwekwjlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420950.2926116-154-140339486345499/AnsiballZ_file.py'
Jan 26 09:49:10 compute-1 sudo[114369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:10 compute-1 python3.9[114371]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:49:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:10.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:10 compute-1 sudo[114369]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:11 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57700038f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:11 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5760001820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:11 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:11.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:11 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:49:11 compute-1 sshd-session[113449]: Connection closed by 192.168.122.30 port 40994
Jan 26 09:49:11 compute-1 sshd-session[113446]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:49:11 compute-1 systemd[1]: session-45.scope: Deactivated successfully.
Jan 26 09:49:11 compute-1 systemd[1]: session-45.scope: Consumed 3.665s CPU time.
Jan 26 09:49:11 compute-1 systemd-logind[783]: Session 45 logged out. Waiting for processes to exit.
Jan 26 09:49:11 compute-1 systemd-logind[783]: Removed session 45.
Jan 26 09:49:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094911 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:49:11 compute-1 ceph-mon[80107]: pgmap v201: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:49:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:12.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:13 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c001cc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:13 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57700038f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:13 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5760001820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:13.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:13 compute-1 ceph-mon[80107]: pgmap v202: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 26 09:49:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:14.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:15 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:15 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:15 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57700038f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:15.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:15 compute-1 ceph-mon[80107]: pgmap v203: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 26 09:49:15 compute-1 sudo[114399]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:49:15 compute-1 sudo[114399]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:49:15 compute-1 sudo[114399]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:16 compute-1 sshd-session[114424]: Accepted publickey for zuul from 192.168.122.30 port 43338 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:49:16 compute-1 systemd-logind[783]: New session 46 of user zuul.
Jan 26 09:49:16 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:49:16 compute-1 systemd[1]: Started Session 46 of User zuul.
Jan 26 09:49:16 compute-1 sshd-session[114424]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:49:16 compute-1 sudo[114500]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:49:16 compute-1 sudo[114500]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:49:16 compute-1 sudo[114500]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:16 compute-1 sudo[114546]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 09:49:16 compute-1 sudo[114546]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:49:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:49:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:16.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:49:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:17 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5760001820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:17 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:17 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:17.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:17 compute-1 python3.9[114627]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:49:17 compute-1 sudo[114546]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:18 compute-1 sudo[114813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khbhylubspctjbikvwhpnhrezitfbuha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420957.7385197-58-34175862547403/AnsiballZ_setup.py'
Jan 26 09:49:18 compute-1 sudo[114813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:18 compute-1 ceph-mon[80107]: pgmap v204: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:49:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:49:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:49:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:49:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 09:49:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:49:18 compute-1 python3.9[114815]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 09:49:18 compute-1 sudo[114813]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:18.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:18 compute-1 sudo[114897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndxxljzvqoemqqdvyfkhoqkqqlsseszw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420957.7385197-58-34175862547403/AnsiballZ_dnf.py'
Jan 26 09:49:18 compute-1 sudo[114897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:19 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003a70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:19 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5760001820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:19 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c002e10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:49:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 09:49:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 09:49:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:49:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:49:19 compute-1 python3.9[114899]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 09:49:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:49:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:19.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:49:20 compute-1 ceph-mon[80107]: pgmap v205: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:49:20 compute-1 sudo[114897]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:20.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:21 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:21 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:21 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600030a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:21.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:21 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:49:21 compute-1 python3.9[115053]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:49:22 compute-1 sshd-session[114978]: Connection closed by authenticating user root 152.42.136.110 port 50216 [preauth]
Jan 26 09:49:22 compute-1 ceph-mon[80107]: pgmap v206: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:49:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:49:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:22.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:49:22 compute-1 sudo[115155]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 09:49:22 compute-1 sudo[115155]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:49:22 compute-1 sudo[115155]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:23 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c003730 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:23 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003a70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:23 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:23.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:23 compute-1 python3.9[115230]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 09:49:23 compute-1 ceph-mon[80107]: pgmap v207: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:49:23 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:49:23 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 09:49:23 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:49:24 compute-1 python3.9[115382]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:49:24 compute-1 python3.9[115532]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:49:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:24.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:25 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:25 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c003730 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:25 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003a70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:25.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:25 compute-1 sshd-session[114427]: Connection closed by 192.168.122.30 port 43338
Jan 26 09:49:25 compute-1 sshd-session[114424]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:49:25 compute-1 systemd[1]: session-46.scope: Deactivated successfully.
Jan 26 09:49:25 compute-1 systemd[1]: session-46.scope: Consumed 6.149s CPU time.
Jan 26 09:49:25 compute-1 systemd-logind[783]: Session 46 logged out. Waiting for processes to exit.
Jan 26 09:49:25 compute-1 systemd-logind[783]: Removed session 46.
Jan 26 09:49:25 compute-1 sshd-session[115533]: Invalid user admin from 104.248.198.71 port 56474
Jan 26 09:49:25 compute-1 sshd-session[115533]: Connection closed by invalid user admin 104.248.198.71 port 56474 [preauth]
Jan 26 09:49:25 compute-1 ceph-mon[80107]: pgmap v208: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:49:26 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:49:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:26.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:27 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003a70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:27 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:27 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c003730 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:27.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:28 compute-1 ceph-mon[80107]: pgmap v209: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:49:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:49:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:28.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:49:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:29 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003a70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:29 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:29 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:49:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:29.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:49:29 compute-1 ceph-mon[80107]: pgmap v210: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:49:30 compute-1 sshd-session[115562]: Accepted publickey for zuul from 192.168.122.30 port 42480 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:49:30 compute-1 systemd-logind[783]: New session 47 of user zuul.
Jan 26 09:49:30 compute-1 systemd[1]: Started Session 47 of User zuul.
Jan 26 09:49:30 compute-1 sshd-session[115562]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:49:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:49:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:30.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:49:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:31 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c003730 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:31 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003a70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:31 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:31 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:49:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:49:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:31.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:49:31 compute-1 python3.9[115715]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:49:31 compute-1 ceph-mon[80107]: pgmap v211: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 26 09:49:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:32.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:33 compute-1 sudo[115870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnxbttyzzpzjvathptriudpwfkjyrjkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420972.6187232-107-187487120416663/AnsiballZ_file.py'
Jan 26 09:49:33 compute-1 sudo[115870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:33 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:33 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c003730 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:33 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003a70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:33 compute-1 python3.9[115872]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:49:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:33.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:33 compute-1 sudo[115870]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:33 compute-1 sudo[116023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gthidihilmvqdbksfunfbayhezqjiqhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420973.3846943-107-117039653161456/AnsiballZ_file.py'
Jan 26 09:49:33 compute-1 sudo[116023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:33 compute-1 ceph-mon[80107]: pgmap v212: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:49:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:49:33 compute-1 python3.9[116025]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:49:33 compute-1 sudo[116023]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:34 compute-1 sudo[116175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smclzgjjrebeszsgimpklmumzohwzqtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420974.070227-149-96041922822908/AnsiballZ_stat.py'
Jan 26 09:49:34 compute-1 sudo[116175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:34 compute-1 python3.9[116177]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:49:34 compute-1 sudo[116175]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:49:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:34.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:49:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:35 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:35 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:35 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:35.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:35 compute-1 sudo[116299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmmnxnbxvgzkezyllccnnuzetubjcuuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420974.070227-149-96041922822908/AnsiballZ_copy.py'
Jan 26 09:49:35 compute-1 sudo[116299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:35 compute-1 python3.9[116301]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420974.070227-149-96041922822908/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=bea7362bba0757b80ba784e089b88d8ab88c30d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:49:35 compute-1 sudo[116299]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:35 compute-1 ceph-mon[80107]: pgmap v213: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:49:35 compute-1 sudo[116452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuwbjuzpkyidscehdxmfzymjpixmzcsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420975.650681-149-203657932792799/AnsiballZ_stat.py'
Jan 26 09:49:35 compute-1 sudo[116452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:35 compute-1 sudo[116453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:49:35 compute-1 sudo[116453]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:49:36 compute-1 sudo[116453]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:36 compute-1 python3.9[116467]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:49:36 compute-1 sudo[116452]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:49:36 compute-1 sudo[116600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgescknlhktoldipisvurjytneddtcaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420975.650681-149-203657932792799/AnsiballZ_copy.py'
Jan 26 09:49:36 compute-1 sudo[116600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:36 compute-1 python3.9[116602]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420975.650681-149-203657932792799/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=679f96efe917c3889f556f17807e671104af52ec backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:49:36 compute-1 sudo[116600]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:49:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:36.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:49:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:37 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57840020e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:37 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:37 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c003730 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:37 compute-1 sudo[116752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvuuzqvastqtrzznwddakzzdavhjyskz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420976.9063451-149-223982833928585/AnsiballZ_stat.py'
Jan 26 09:49:37 compute-1 sudo[116752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:37.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:37 compute-1 python3.9[116754]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:49:37 compute-1 sudo[116752]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:37 compute-1 sudo[116876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdzucmstsbfdawwfozyzzbuylmbpmisn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420976.9063451-149-223982833928585/AnsiballZ_copy.py'
Jan 26 09:49:37 compute-1 sudo[116876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:37 compute-1 ceph-mon[80107]: pgmap v214: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:49:37 compute-1 python3.9[116878]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420976.9063451-149-223982833928585/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=cb3c2f57a5234f1a1ac2af8e96022b0fed20bf3d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:49:37 compute-1 sudo[116876]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:38 compute-1 sudo[117028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxugxkgdbxbzqynkiahndjpnlninkwic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420978.1917777-287-172182538980904/AnsiballZ_file.py'
Jan 26 09:49:38 compute-1 sudo[117028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:38 compute-1 python3.9[117030]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:49:38 compute-1 sudo[117028]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:38.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57840020e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:39 compute-1 sudo[117180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xppxsaiathwxwoytnlfnoaieyvmheypk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420978.8556106-287-265668741615189/AnsiballZ_file.py'
Jan 26 09:49:39 compute-1 sudo[117180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:39.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:39 compute-1 python3.9[117182]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:49:39 compute-1 sudo[117180]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:39 compute-1 sudo[117333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzttujormbymoiwwgcimqbazebwspvev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420979.5086622-330-264527805643681/AnsiballZ_stat.py'
Jan 26 09:49:39 compute-1 sudo[117333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094939 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:49:39 compute-1 python3.9[117335]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:49:39 compute-1 ceph-mon[80107]: pgmap v215: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:49:39 compute-1 sudo[117333]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:40 compute-1 sudo[117456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aekdgrnxqhgdpqobeathiguyaguckgtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420979.5086622-330-264527805643681/AnsiballZ_copy.py'
Jan 26 09:49:40 compute-1 sudo[117456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:40 compute-1 python3.9[117458]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420979.5086622-330-264527805643681/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=9cafdd1360dab1b4d60bfa8e86f8276e76706ef0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:49:40 compute-1 sudo[117456]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:40 compute-1 sudo[117608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhhbhcekqfkuxhnchfnyetbmpmuekbnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420980.6596706-330-225209645518720/AnsiballZ_stat.py'
Jan 26 09:49:40 compute-1 sudo[117608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:40.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:41 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c003730 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:41 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:41 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57680011d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:41 compute-1 python3.9[117610]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:49:41 compute-1 sudo[117608]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:41 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:49:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:41.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:41 compute-1 sudo[117733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdwedolmbvvmozitiafcxlyabqmjszop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420980.6596706-330-225209645518720/AnsiballZ_copy.py'
Jan 26 09:49:41 compute-1 sudo[117733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:41 compute-1 python3.9[117735]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420980.6596706-330-225209645518720/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=3190a221de07992f337d3e4a96f47a3d3dd4b35b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:49:41 compute-1 sudo[117733]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:42 compute-1 ceph-mon[80107]: pgmap v216: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 09:49:42 compute-1 sudo[117885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfpyxjyjvrkgstkcunqznhrlsjnebfwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420981.886361-330-114505023901714/AnsiballZ_stat.py'
Jan 26 09:49:42 compute-1 sudo[117885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:42 compute-1 python3.9[117887]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:49:42 compute-1 sudo[117885]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:42 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094942 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:49:42 compute-1 sudo[118008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blvlbtzfxiywefjsjtfdgpohoygurdts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420981.886361-330-114505023901714/AnsiballZ_copy.py'
Jan 26 09:49:42 compute-1 sudo[118008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:42.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:43.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:43 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:43 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 50 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:43 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c003730 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:43 compute-1 ceph-mon[80107]: pgmap v217: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 26 09:49:44 compute-1 python3.9[118011]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420981.886361-330-114505023901714/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=93aa6b8988d194c9c88d8fb17c0ff48744cb9801 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:49:44 compute-1 sudo[118008]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:44 compute-1 sshd-session[118010]: Connection closed by authenticating user root 178.62.249.31 port 43238 [preauth]
Jan 26 09:49:44 compute-1 sudo[118168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbxprmuwfcpiyrirlkoepsvlhepfffos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420984.3158193-480-277964156879901/AnsiballZ_file.py'
Jan 26 09:49:44 compute-1 sudo[118168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:44 compute-1 python3.9[118170]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:49:44 compute-1 sudo[118168]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:44.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:45 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:45 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57840020e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:45 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003c80 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:45 compute-1 sudo[118320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uplkrqzqwzcmfnvidspyayyjuxcbkopv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420984.9499156-480-155445881609893/AnsiballZ_file.py'
Jan 26 09:49:45 compute-1 sudo[118320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:45.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:45 compute-1 python3.9[118322]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:49:45 compute-1 sudo[118320]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:45 compute-1 sudo[118473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtggbudgxkfzjmsjuvlztomdjbuhplmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420985.6209319-528-122858067030783/AnsiballZ_stat.py'
Jan 26 09:49:45 compute-1 sudo[118473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:46 compute-1 ceph-mon[80107]: pgmap v218: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 26 09:49:46 compute-1 python3.9[118475]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:49:46 compute-1 sudo[118473]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:46 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:49:46 compute-1 sudo[118596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ialsqsctmyxcpexpvlhsmvcbviebwmfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420985.6209319-528-122858067030783/AnsiballZ_copy.py'
Jan 26 09:49:46 compute-1 sudo[118596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:46 compute-1 python3.9[118598]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420985.6209319-528-122858067030783/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=4415a399963cf83213fc16f9ed5bbf28ef091eab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:49:46 compute-1 sudo[118596]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:49:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:46.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:49:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:47 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:47 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:47 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5784002ea0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:47 compute-1 sudo[118748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbdrlqnhtinbjqbdklldudadkvayimpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420986.7746-528-105226788449037/AnsiballZ_stat.py'
Jan 26 09:49:47 compute-1 sudo[118748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:47.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:47 compute-1 python3.9[118750]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:49:47 compute-1 sudo[118748]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:47 compute-1 sudo[118872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsfcdbkosneiwmnhtlyxgcrdxzebbaxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420986.7746-528-105226788449037/AnsiballZ_copy.py'
Jan 26 09:49:47 compute-1 sudo[118872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:47 compute-1 python3.9[118874]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420986.7746-528-105226788449037/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=3190a221de07992f337d3e4a96f47a3d3dd4b35b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:49:47 compute-1 sudo[118872]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:48 compute-1 ceph-mon[80107]: pgmap v219: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 26 09:49:48 compute-1 sudo[119024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qleltpvrhjllwzjrtmpuhjkcdkiluigs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420987.9584496-528-137666898163810/AnsiballZ_stat.py'
Jan 26 09:49:48 compute-1 sudo[119024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:48 compute-1 python3.9[119026]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:49:48 compute-1 sudo[119024]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:48 compute-1 sudo[119147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzmdqfcfyrghcmehkcjejrlqqigfdwxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420987.9584496-528-137666898163810/AnsiballZ_copy.py'
Jan 26 09:49:48 compute-1 sudo[119147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:48 compute-1 python3.9[119149]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420987.9584496-528-137666898163810/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=d0eed1af3b55a18e8c6edf61338ef59ffee7ebc2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:49:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:49:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:48.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:49:48 compute-1 sudo[119147]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:49 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003ca0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:49 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:49 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:49 : epoch 69773877 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:49:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:49:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:49.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:50 compute-1 sudo[119300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdsbtdrldtetaohnqqvjukzamvpqirsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420989.6909049-698-65191359825645/AnsiballZ_file.py'
Jan 26 09:49:50 compute-1 sudo[119300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:50 compute-1 python3.9[119302]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:49:50 compute-1 sudo[119300]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:50 compute-1 ceph-mon[80107]: pgmap v220: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 26 09:49:50 compute-1 sudo[119452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xioxuhongtivamaghwyjzzminmtuvfse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420990.3987231-722-228210027792333/AnsiballZ_stat.py'
Jan 26 09:49:50 compute-1 sudo[119452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:50 compute-1 python3.9[119454]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:49:50 compute-1 sudo[119452]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:50.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5784002ea0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003cc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:49:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:51.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:51 compute-1 sudo[119575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okebigaptccgmbjodzkavxhvzprxwefp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420990.3987231-722-228210027792333/AnsiballZ_copy.py'
Jan 26 09:49:51 compute-1 sudo[119575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:51 compute-1 ceph-mon[80107]: pgmap v221: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 511 B/s wr, 1 op/s
Jan 26 09:49:51 compute-1 python3.9[119577]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420990.3987231-722-228210027792333/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a4f71bf0609e75a0e091c9100076ae4c4a7bed4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:49:51 compute-1 sudo[119575]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:52 compute-1 sudo[119728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qubqyrctmnmewlhrdpxaqqotxpyhgqei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420991.8083587-773-7896775490227/AnsiballZ_file.py'
Jan 26 09:49:52 compute-1 sudo[119728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:52 : epoch 69773877 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:49:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:52 : epoch 69773877 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:49:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:52 : epoch 69773877 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:49:52 compute-1 python3.9[119730]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:49:52 compute-1 sudo[119728]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:52 compute-1 sudo[119880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrnlappburkvnpbapfxeykqvlijimtpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420992.509027-797-269279461498938/AnsiballZ_stat.py'
Jan 26 09:49:52 compute-1 sudo[119880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:52.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:53 compute-1 python3.9[119882]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:49:53 compute-1 sudo[119880]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:53 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:53 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5784003bb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:53 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003ce0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:53.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:53 compute-1 sudo[120004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izvyskvtytpdyjsokswhnxnewqojzpfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420992.509027-797-269279461498938/AnsiballZ_copy.py'
Jan 26 09:49:53 compute-1 sudo[120004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:53 compute-1 python3.9[120006]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420992.509027-797-269279461498938/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a4f71bf0609e75a0e091c9100076ae4c4a7bed4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:49:53 compute-1 sudo[120004]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:54 compute-1 ceph-mon[80107]: pgmap v222: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 511 B/s wr, 1 op/s
Jan 26 09:49:54 compute-1 sudo[120156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhwyjlfmbhnpuehrccrogrgsfcnpocfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420993.9327617-846-163970448020533/AnsiballZ_file.py'
Jan 26 09:49:54 compute-1 sudo[120156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:54 compute-1 python3.9[120158]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:49:54 compute-1 sudo[120156]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:54 : epoch 69773877 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:49:54 compute-1 sudo[120308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmjbyccmadnwwspumzzkhsvhtbyuswuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420994.6720812-872-67717865404424/AnsiballZ_stat.py'
Jan 26 09:49:54 compute-1 sudo[120308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:54.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:55 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:55 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:55 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5784003bb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:55 compute-1 python3.9[120310]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:49:55 compute-1 sudo[120308]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:55.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:55 compute-1 sudo[120432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnlbcnjpexsdpkhqlwbzosacvvjtwval ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420994.6720812-872-67717865404424/AnsiballZ_copy.py'
Jan 26 09:49:55 compute-1 sudo[120432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:55 compute-1 python3.9[120434]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420994.6720812-872-67717865404424/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a4f71bf0609e75a0e091c9100076ae4c4a7bed4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:49:55 compute-1 sudo[120432]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:56 compute-1 sudo[120482]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:49:56 compute-1 sudo[120482]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:49:56 compute-1 sudo[120482]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:56 compute-1 ceph-mon[80107]: pgmap v223: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:49:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:49:56 compute-1 sudo[120609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpevztepuoasobibksrtexgbafnperpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420996.0110612-922-185541592912690/AnsiballZ_file.py'
Jan 26 09:49:56 compute-1 sudo[120609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:56 compute-1 python3.9[120611]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:49:56 compute-1 sudo[120609]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:56.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:57 compute-1 sudo[120761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vggmswdfjsgyzqhxhtpcxbatopbyzylp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420996.695903-944-24203934788152/AnsiballZ_stat.py'
Jan 26 09:49:57 compute-1 sudo[120761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:57 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003d00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:57 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:57 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:57 compute-1 python3.9[120763]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:49:57 compute-1 sudo[120761]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:57.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:57 : epoch 69773877 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:49:57 compute-1 sudo[120885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulvtanbqnyxzpeprhauxzxttdplvejvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420996.695903-944-24203934788152/AnsiballZ_copy.py'
Jan 26 09:49:57 compute-1 sudo[120885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:57 compute-1 python3.9[120887]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420996.695903-944-24203934788152/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a4f71bf0609e75a0e091c9100076ae4c4a7bed4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:49:57 compute-1 sudo[120885]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:58 compute-1 ceph-mon[80107]: pgmap v224: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:49:58 compute-1 sudo[121037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgbngcfnnvnrjrgcnbfyguizcuinlvrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420998.0858555-989-91573312523086/AnsiballZ_file.py'
Jan 26 09:49:58 compute-1 sudo[121037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:58 compute-1 python3.9[121039]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:49:58 compute-1 sudo[121037]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:49:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:58.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:49:58 compute-1 sudo[121189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpneyfvcwbfrgqxdpthdsmfkkbpbrdxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420998.7195146-1011-28607873709536/AnsiballZ_stat.py'
Jan 26 09:49:58 compute-1 sudo[121189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:59 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5784003bb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:59 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003d20 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:59 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:49:59 compute-1 python3.9[121191]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:49:59 compute-1 sudo[121189]: pam_unix(sudo:session): session closed for user root
Jan 26 09:49:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:49:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:49:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:59.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:49:59 compute-1 sudo[121313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slodcmeasxocspjjkjzlmwfvfzfqulhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420998.7195146-1011-28607873709536/AnsiballZ_copy.py'
Jan 26 09:49:59 compute-1 sudo[121313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:49:59 compute-1 python3.9[121315]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420998.7195146-1011-28607873709536/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a4f71bf0609e75a0e091c9100076ae4c4a7bed4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:49:59 compute-1 sudo[121313]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:00 compute-1 ceph-mon[80107]: pgmap v225: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:50:00 compute-1 ceph-mon[80107]: overall HEALTH_OK
Jan 26 09:50:00 compute-1 sudo[121465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnkapuhloxzeiywrnioinwvdocybcldq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769420999.9971538-1055-208842185542466/AnsiballZ_file.py'
Jan 26 09:50:00 compute-1 sudo[121465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:00 compute-1 python3.9[121467]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:50:00 compute-1 sudo[121465]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:50:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:00.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:50:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:01 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:01 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57840048c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:01 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003d40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:01 compute-1 sudo[121617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sohkzukkjolnnmhqvztdfzzuerqgqzot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421000.9023566-1079-141695714015866/AnsiballZ_stat.py'
Jan 26 09:50:01 compute-1 sudo[121617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:50:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:50:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:01.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:50:01 compute-1 python3.9[121619]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:50:01 compute-1 sudo[121617]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:01 compute-1 ceph-mon[80107]: pgmap v226: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.1 KiB/s rd, 1.1 KiB/s wr, 4 op/s
Jan 26 09:50:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095001 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:50:01 compute-1 sudo[121741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skhticdftcwxnblggmrvtbkueiiewzeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421000.9023566-1079-141695714015866/AnsiballZ_copy.py'
Jan 26 09:50:01 compute-1 sudo[121741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:02 compute-1 python3.9[121743]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421000.9023566-1079-141695714015866/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a4f71bf0609e75a0e091c9100076ae4c4a7bed4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:50:02 compute-1 sudo[121741]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:02 compute-1 sshd-session[115565]: Connection closed by 192.168.122.30 port 42480
Jan 26 09:50:02 compute-1 sshd-session[115562]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:50:02 compute-1 systemd[1]: session-47.scope: Deactivated successfully.
Jan 26 09:50:02 compute-1 systemd[1]: session-47.scope: Consumed 23.606s CPU time.
Jan 26 09:50:02 compute-1 systemd-logind[783]: Session 47 logged out. Waiting for processes to exit.
Jan 26 09:50:02 compute-1 systemd-logind[783]: Removed session 47.
Jan 26 09:50:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:02.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:03 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:03 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:03 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:03.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:03 compute-1 ceph-mon[80107]: pgmap v227: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 597 B/s wr, 2 op/s
Jan 26 09:50:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:50:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095004 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:50:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:04.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:05 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003d60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:05 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:05 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:05.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:05 compute-1 ceph-mon[80107]: pgmap v228: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 597 B/s wr, 2 op/s
Jan 26 09:50:06 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:50:06 compute-1 sshd-session[121770]: Connection closed by authenticating user root 152.42.136.110 port 45640 [preauth]
Jan 26 09:50:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:06.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:07 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57840048c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:07 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003d80 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:07 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:50:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:07.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:50:07 compute-1 sshd-session[121773]: Accepted publickey for zuul from 192.168.122.30 port 44056 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:50:07 compute-1 systemd-logind[783]: New session 48 of user zuul.
Jan 26 09:50:07 compute-1 systemd[1]: Started Session 48 of User zuul.
Jan 26 09:50:07 compute-1 sshd-session[121773]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:50:08 compute-1 ceph-mon[80107]: pgmap v229: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 170 B/s wr, 1 op/s
Jan 26 09:50:08 compute-1 sudo[121926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmzcrsbxyqmzrqykdcyltmtjvaquraud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421007.9144723-22-254283956755502/AnsiballZ_file.py'
Jan 26 09:50:08 compute-1 sudo[121926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:08 compute-1 python3.9[121928]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:50:08 compute-1 sudo[121926]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:08.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:09 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:09 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57840048c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:09 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003da0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:09.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:09 compute-1 sudo[122078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbsttfgrvikibsvbizjmfkuepvfdrzgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421008.860517-58-130279010580214/AnsiballZ_stat.py'
Jan 26 09:50:09 compute-1 sudo[122078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:09 compute-1 python3.9[122080]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:50:09 compute-1 sudo[122078]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:10 compute-1 ceph-mon[80107]: pgmap v230: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 170 B/s wr, 1 op/s
Jan 26 09:50:10 compute-1 sudo[122202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsfchbcwlcvitflphdepirveyysfdpgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421008.860517-58-130279010580214/AnsiballZ_copy.py'
Jan 26 09:50:10 compute-1 sudo[122202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:10 compute-1 python3.9[122204]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421008.860517-58-130279010580214/.source.conf _original_basename=ceph.conf follow=False checksum=d9847d470420fd34212d6cc1f2ab891aeddd27f2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:50:10 compute-1 sudo[122202]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:50:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:10.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:50:11 compute-1 sudo[122354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yookytykdsyliikiqmdzzzypemfllyna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421010.506798-58-88342665704759/AnsiballZ_stat.py'
Jan 26 09:50:11 compute-1 sudo[122354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:11 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:11 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:11 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57840048c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:11 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:50:11 compute-1 python3.9[122356]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:50:11 compute-1 sudo[122354]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:11.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:11 compute-1 sudo[122478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktdmkaqlawgvrzriqtcowqmdfvcvkizj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421010.506798-58-88342665704759/AnsiballZ_copy.py'
Jan 26 09:50:11 compute-1 sudo[122478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:11 compute-1 python3.9[122480]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421010.506798-58-88342665704759/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=e8137016e459ec15b04fac1b40fd6c611375a3cb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:50:11 compute-1 sudo[122478]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:12 compute-1 ceph-mon[80107]: pgmap v231: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 170 B/s wr, 1 op/s
Jan 26 09:50:12 compute-1 sshd-session[121776]: Connection closed by 192.168.122.30 port 44056
Jan 26 09:50:12 compute-1 sshd-session[121773]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:50:12 compute-1 systemd[1]: session-48.scope: Deactivated successfully.
Jan 26 09:50:12 compute-1 systemd[1]: session-48.scope: Consumed 2.859s CPU time.
Jan 26 09:50:12 compute-1 systemd-logind[783]: Session 48 logged out. Waiting for processes to exit.
Jan 26 09:50:12 compute-1 systemd-logind[783]: Removed session 48.
Jan 26 09:50:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:12.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:13 compute-1 kernel: ganesha.nfsd[116231]: segfault at 50 ip 00007f581686d32e sp 00007f57a17f9210 error 4 in libntirpc.so.5.8[7f5816852000+2c000] likely on CPU 7 (core 0, socket 7)
Jan 26 09:50:13 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 09:50:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:13 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57840048c0 fd 47 proxy ignored for local
Jan 26 09:50:13 compute-1 systemd[1]: Started Process Core Dump (PID 122505/UID 0).
Jan 26 09:50:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:13.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:14 compute-1 ceph-mon[80107]: pgmap v232: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:50:14 compute-1 systemd-coredump[122506]: Process 111037 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 57:
                                                    #0  0x00007f581686d32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 26 09:50:14 compute-1 systemd[1]: systemd-coredump@3-122505-0.service: Deactivated successfully.
Jan 26 09:50:14 compute-1 systemd[1]: systemd-coredump@3-122505-0.service: Consumed 1.246s CPU time.
Jan 26 09:50:14 compute-1 podman[122512]: 2026-01-26 09:50:14.48765948 +0000 UTC m=+0.047493320 container died 1a7be39caead16f4589c4b559c326459ed0a4d717b062f3a71c6fb265aca0cb8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Jan 26 09:50:14 compute-1 systemd[1]: var-lib-containers-storage-overlay-b935da6dc4aeecbe481a98d1071edd21957b8052ba6272d61816dfce610d2b23-merged.mount: Deactivated successfully.
Jan 26 09:50:14 compute-1 podman[122512]: 2026-01-26 09:50:14.532331632 +0000 UTC m=+0.092165502 container remove 1a7be39caead16f4589c4b559c326459ed0a4d717b062f3a71c6fb265aca0cb8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1)
Jan 26 09:50:14 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 09:50:14 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 09:50:14 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.691s CPU time.
Jan 26 09:50:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:14.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:50:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:15.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:50:16 compute-1 ceph-mon[80107]: pgmap v233: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:50:16 compute-1 sudo[122556]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:50:16 compute-1 sudo[122556]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:50:16 compute-1 sudo[122556]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:16 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:50:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:16.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:50:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:17.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:50:17 compute-1 sshd-session[122582]: Accepted publickey for zuul from 192.168.122.30 port 40358 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:50:17 compute-1 systemd-logind[783]: New session 49 of user zuul.
Jan 26 09:50:17 compute-1 systemd[1]: Started Session 49 of User zuul.
Jan 26 09:50:17 compute-1 sshd-session[122582]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:50:18 compute-1 ceph-mon[80107]: pgmap v234: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:50:18 compute-1 sshd-session[122623]: Invalid user admin from 104.248.198.71 port 50806
Jan 26 09:50:18 compute-1 sshd-session[122623]: Connection closed by invalid user admin 104.248.198.71 port 50806 [preauth]
Jan 26 09:50:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:18.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095019 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:50:19 compute-1 python3.9[122737]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:50:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:50:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:19.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:20 compute-1 sudo[122892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jynziruliqxvhynhuvaxpiuydypnuttl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421019.7054267-58-150118501262630/AnsiballZ_file.py'
Jan 26 09:50:20 compute-1 sudo[122892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:20 compute-1 ceph-mon[80107]: pgmap v235: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:50:20 compute-1 python3.9[122894]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:50:20 compute-1 sudo[122892]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:20 compute-1 sudo[123044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlrtsbqmhrkhfdtvvngyzwntuypcomab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421020.486417-58-72707116128282/AnsiballZ_file.py'
Jan 26 09:50:20 compute-1 sudo[123044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:20 compute-1 python3.9[123046]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:50:20 compute-1 sudo[123044]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:20.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:21 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:50:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:21.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:21 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 09:50:21 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 6316 writes, 26K keys, 6316 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 6316 writes, 1069 syncs, 5.91 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 6316 writes, 26K keys, 6316 commit groups, 1.0 writes per commit group, ingest: 19.65 MB, 0.03 MB/s
                                           Interval WAL: 6316 writes, 1069 syncs, 5.91 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Jan 26 09:50:21 compute-1 python3.9[123197]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:50:22 compute-1 ceph-mon[80107]: pgmap v236: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 09:50:22 compute-1 sudo[123347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uawtkinarqpnqihialxnmoqxijzyylyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421022.1958778-127-273137077069737/AnsiballZ_seboolean.py'
Jan 26 09:50:22 compute-1 sudo[123347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:22 compute-1 python3.9[123349]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 26 09:50:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:50:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:23.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:50:23 compute-1 sudo[123350]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:50:23 compute-1 sudo[123350]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:50:23 compute-1 sudo[123350]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:23 compute-1 sudo[123375]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Jan 26 09:50:23 compute-1 sudo[123375]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:50:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:23.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:23 compute-1 ceph-mon[80107]: pgmap v237: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:50:23 compute-1 sudo[123375]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:23 compute-1 sudo[123422]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:50:23 compute-1 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 26 09:50:23 compute-1 sudo[123422]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:50:23 compute-1 sudo[123422]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:23 compute-1 sudo[123447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 09:50:23 compute-1 sudo[123447]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:50:24 compute-1 sudo[123347]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:24 compute-1 sudo[123447]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:24 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:50:24 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:50:24 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:50:24 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:50:24 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:50:24 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 09:50:24 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:50:24 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:50:24 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 09:50:24 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 09:50:24 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:50:24 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 4.
Jan 26 09:50:24 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:50:24 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.691s CPU time.
Jan 26 09:50:24 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 09:50:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:25.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:25 compute-1 sudo[123669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqyhoptsrnbynhgkgyolikxwjxqgticy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421024.4703622-157-163548285213886/AnsiballZ_setup.py'
Jan 26 09:50:25 compute-1 sudo[123669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:25 compute-1 podman[123702]: 2026-01-26 09:50:25.205778927 +0000 UTC m=+0.043673385 container create a0e54dfdc80639d6a858a302b9137bdb7eb4ae31fc0c5c95cbf6faee0fa517e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:50:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50ad3cd59e5c5d65126cb5de338a6d9e374cbf09eb3d2dd944710879fec095a1/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 09:50:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50ad3cd59e5c5d65126cb5de338a6d9e374cbf09eb3d2dd944710879fec095a1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:50:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50ad3cd59e5c5d65126cb5de338a6d9e374cbf09eb3d2dd944710879fec095a1/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 09:50:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50ad3cd59e5c5d65126cb5de338a6d9e374cbf09eb3d2dd944710879fec095a1/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 09:50:25 compute-1 podman[123702]: 2026-01-26 09:50:25.275063302 +0000 UTC m=+0.112957760 container init a0e54dfdc80639d6a858a302b9137bdb7eb4ae31fc0c5c95cbf6faee0fa517e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 09:50:25 compute-1 podman[123702]: 2026-01-26 09:50:25.280020048 +0000 UTC m=+0.117914506 container start a0e54dfdc80639d6a858a302b9137bdb7eb4ae31fc0c5c95cbf6faee0fa517e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 09:50:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:25 compute-1 podman[123702]: 2026-01-26 09:50:25.189026698 +0000 UTC m=+0.026921176 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:50:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:25.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:25 compute-1 bash[123702]: a0e54dfdc80639d6a858a302b9137bdb7eb4ae31fc0c5c95cbf6faee0fa517e4
Jan 26 09:50:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 09:50:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 09:50:25 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:50:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 09:50:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 09:50:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 09:50:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 09:50:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 09:50:25 compute-1 python3.9[123680]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 09:50:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:50:25 compute-1 sudo[123669]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:25 compute-1 ceph-mon[80107]: pgmap v238: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:50:26 compute-1 sudo[123843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqrbhkadmgrafmfppvvhhxrwpkgtfntb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421024.4703622-157-163548285213886/AnsiballZ_dnf.py'
Jan 26 09:50:26 compute-1 sudo[123843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:26 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:50:26 compute-1 python3.9[123845]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:50:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:27.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:27.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:27 compute-1 sudo[123843]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:27 compute-1 ceph-mon[80107]: pgmap v239: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:50:28 compute-1 sudo[123997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzsmkncnotgydzcgxxolvsvghltonbwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421027.9599447-193-135908672650464/AnsiballZ_systemd.py'
Jan 26 09:50:28 compute-1 sudo[123997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:28 compute-1 python3.9[123999]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 09:50:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:50:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:29.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:50:29 compute-1 sudo[123997]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:29.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:29 compute-1 sudo[124154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tailkndmmjqqplpjpysgnmfkoammkuot ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769421029.3881574-217-46836520543627/AnsiballZ_edpm_nftables_snippet.py'
Jan 26 09:50:29 compute-1 sudo[124154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:30 compute-1 python3[124156]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                             rule:
                                               proto: udp
                                               dport: 4789
                                           - rule_name: 119 neutron geneve networks
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               state: ["UNTRACKED"]
                                           - rule_name: 120 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: OUTPUT
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                           - rule_name: 121 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: PREROUTING
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                            dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 26 09:50:30 compute-1 sudo[124154]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:30 compute-1 ceph-mon[80107]: pgmap v240: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:50:30 compute-1 sudo[124210]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 09:50:30 compute-1 sudo[124210]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:50:30 compute-1 sudo[124210]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:30 compute-1 sudo[124332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmlsytetdflolzlnonpvevnauktuuqtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421030.4770539-244-194252877019791/AnsiballZ_file.py'
Jan 26 09:50:30 compute-1 sudo[124332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:31.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:31 compute-1 python3.9[124334]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:50:31 compute-1 sudo[124332]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:31 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:50:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:50:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:31.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:50:31 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:50:31 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:50:31 compute-1 ceph-mon[80107]: pgmap v241: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 09:50:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:50:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:50:31 compute-1 sudo[124486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvzbpdqvclcgjumgevwkmztpkogjvfva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421031.3068733-268-51390776398609/AnsiballZ_stat.py'
Jan 26 09:50:31 compute-1 sudo[124486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:31 compute-1 sshd-session[124308]: Connection closed by authenticating user root 178.62.249.31 port 32920 [preauth]
Jan 26 09:50:31 compute-1 python3.9[124488]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:50:32 compute-1 sudo[124486]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:32 compute-1 sudo[124564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axnxxfpiexnfcrqnqxqtrtmdtwtxuywb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421031.3068733-268-51390776398609/AnsiballZ_file.py'
Jan 26 09:50:32 compute-1 sudo[124564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:32 compute-1 python3.9[124566]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:50:32 compute-1 sudo[124564]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:33.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:50:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:33.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:50:33 compute-1 sudo[124716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzwnbymfxobijrjfvjaycnaxhjlxuboi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421032.924049-304-190006843002426/AnsiballZ_stat.py'
Jan 26 09:50:33 compute-1 sudo[124716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:33 compute-1 python3.9[124718]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:50:33 compute-1 sudo[124716]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:33 compute-1 ceph-mon[80107]: pgmap v242: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 09:50:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:50:33 compute-1 sudo[124795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvxvqkjgwrnbydlmpdobygwqvvuuftes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421032.924049-304-190006843002426/AnsiballZ_file.py'
Jan 26 09:50:33 compute-1 sudo[124795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:34 compute-1 python3.9[124797]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.mqvi4rxi recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:50:34 compute-1 sudo[124795]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:34 compute-1 sudo[124947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxyjhmmulusyzyvlspojlchfzjybgsbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421034.3442683-340-24758705277237/AnsiballZ_stat.py'
Jan 26 09:50:34 compute-1 sudo[124947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:34 compute-1 python3.9[124949]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:50:34 compute-1 sudo[124947]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:35.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:35 compute-1 sudo[125025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfdoxqgqfjjenundnwatlrfwxkgucice ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421034.3442683-340-24758705277237/AnsiballZ_file.py'
Jan 26 09:50:35 compute-1 sudo[125025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:50:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:35.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:50:35 compute-1 python3.9[125027]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:50:35 compute-1 sudo[125025]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:35 compute-1 ceph-mon[80107]: pgmap v243: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:50:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:50:36 compute-1 sudo[125152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:50:36 compute-1 sudo[125152]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:50:36 compute-1 sudo[125152]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:36 compute-1 sudo[125203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlhcaluqbadkrpehhcesfppzyolsoild ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421035.738959-379-251736268036008/AnsiballZ_command.py'
Jan 26 09:50:36 compute-1 sudo[125203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:36 compute-1 python3.9[125205]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:50:36 compute-1 sudo[125203]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:37.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:50:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:37.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:50:37 compute-1 sudo[125356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uauhiqnprhenrzshkhcirscmmpxaarpc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769421036.8495345-403-199593379066659/AnsiballZ_edpm_nftables_from_files.py'
Jan 26 09:50:37 compute-1 sudo[125356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 09:50:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 09:50:37 compute-1 python3[125358]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 09:50:37 compute-1 sudo[125356]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:37 compute-1 ceph-mon[80107]: pgmap v244: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:50:38 compute-1 sudo[125521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csgyidrmnigyyrvcuielzyjimejrdmmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421037.945402-427-65948198053841/AnsiballZ_stat.py'
Jan 26 09:50:38 compute-1 sudo[125521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:38 compute-1 python3.9[125523]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:50:38 compute-1 sudo[125521]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:50:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:39.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:50:39 compute-1 sudo[125646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eptxcytjpnmmbkzhheirfurpirhwzpaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421037.945402-427-65948198053841/AnsiballZ_copy.py'
Jan 26 09:50:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:39 compute-1 sudo[125646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:39 compute-1 python3.9[125653]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421037.945402-427-65948198053841/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:50:39 compute-1 sudo[125646]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:50:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:39.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:50:39 compute-1 ceph-mon[80107]: pgmap v245: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:50:40 compute-1 sudo[125804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvezaqjqfsidepxqmqtxhoijvyyiamwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421039.5579093-472-108190636931665/AnsiballZ_stat.py'
Jan 26 09:50:40 compute-1 sudo[125804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:40 compute-1 python3.9[125806]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:50:40 compute-1 sudo[125804]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:41 compute-1 sudo[125929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyfifhydhzlmweindxwygvlqvhzpkeco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421039.5579093-472-108190636931665/AnsiballZ_copy.py'
Jan 26 09:50:41 compute-1 sudo[125929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:50:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:41.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:50:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095041 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:50:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:41 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:50:41 compute-1 python3.9[125931]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421039.5579093-472-108190636931665/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:50:41 compute-1 sudo[125929]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000054s ======
Jan 26 09:50:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:41.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 26 09:50:41 compute-1 sudo[126082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glmezcyupdblzjcrduxaowqauaoejjbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421041.4479537-517-53879095357632/AnsiballZ_stat.py'
Jan 26 09:50:41 compute-1 sudo[126082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:42 compute-1 python3.9[126084]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:50:42 compute-1 sudo[126082]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:42 compute-1 ceph-mon[80107]: pgmap v246: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:50:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:50:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:43.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:50:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff200016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:43 compute-1 sudo[126207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwsvffcwyfxyurvauutsvzuuipvazixw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421041.4479537-517-53879095357632/AnsiballZ_copy.py'
Jan 26 09:50:43 compute-1 sudo[126207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:43.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:43 compute-1 python3.9[126209]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421041.4479537-517-53879095357632/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:50:43 compute-1 sudo[126207]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:44 compute-1 sudo[126360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pahggpuhwedmaugcrfsojsnziougqfcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421043.6279104-562-50147066338492/AnsiballZ_stat.py'
Jan 26 09:50:44 compute-1 sudo[126360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:44 compute-1 ceph-mon[80107]: pgmap v247: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:50:44 compute-1 python3.9[126362]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:50:44 compute-1 sudo[126360]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:44 compute-1 sudo[126485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfxpftkrauhnetazggohiocbmjufdpke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421043.6279104-562-50147066338492/AnsiballZ_copy.py'
Jan 26 09:50:44 compute-1 sudo[126485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:44 compute-1 python3.9[126487]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421043.6279104-562-50147066338492/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:50:44 compute-1 sudo[126485]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:45.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff200016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:50:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:45.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:50:45 compute-1 sudo[126638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhabnidqadtfwfgqvgonkjttakcfqylx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421045.2046874-607-156071656345633/AnsiballZ_stat.py'
Jan 26 09:50:45 compute-1 sudo[126638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:45 compute-1 python3.9[126640]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:50:45 compute-1 sudo[126638]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:46 compute-1 ceph-mon[80107]: pgmap v248: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:50:46 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:50:46 compute-1 sudo[126763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vddetxtpitmdeuqvvdnlbrlaaivvzart ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421045.2046874-607-156071656345633/AnsiballZ_copy.py'
Jan 26 09:50:46 compute-1 sudo[126763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:46 compute-1 python3.9[126765]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421045.2046874-607-156071656345633/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:50:46 compute-1 sudo[126763]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:47.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff480091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:47 compute-1 sudo[126915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfahjeoigawqhwhdfmfmfnqjxqmbnqml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421046.8271182-652-110928519629046/AnsiballZ_file.py'
Jan 26 09:50:47 compute-1 sudo[126915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:47.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:47 compute-1 python3.9[126917]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:50:47 compute-1 sudo[126915]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:48 compute-1 ceph-mon[80107]: pgmap v249: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:50:48 compute-1 sudo[127068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjwkrraweywzhqwyvzdurrbzpzvsalpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421047.9108465-676-71144278215441/AnsiballZ_command.py'
Jan 26 09:50:48 compute-1 sudo[127068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:48 compute-1 python3.9[127070]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:50:48 compute-1 sudo[127068]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:50:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:49.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:50:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff200016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:50:49 compute-1 sudo[127223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjakamwgzntlnocucrutgtwkgkvlvikd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421048.7509952-700-51394392987802/AnsiballZ_blockinfile.py'
Jan 26 09:50:49 compute-1 sudo[127223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:49.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:49 compute-1 python3.9[127225]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:50:49 compute-1 sudo[127223]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:50 compute-1 sudo[127378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-borfibmjduklojopskalhqthcayuhpvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421049.8689132-727-10278847113007/AnsiballZ_command.py'
Jan 26 09:50:50 compute-1 sudo[127378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:50 compute-1 python3.9[127380]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:50:50 compute-1 sudo[127378]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:50 compute-1 ceph-mon[80107]: pgmap v250: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:50:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:51.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff480091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:51 compute-1 sudo[127531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxnslhmrfwwnwngydgzozctkpljjlvjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421050.8583975-751-9066545253887/AnsiballZ_stat.py'
Jan 26 09:50:51 compute-1 sudo[127531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:50:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:50:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:51.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:50:51 compute-1 python3.9[127533]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:50:51 compute-1 ceph-mon[80107]: pgmap v251: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:50:51 compute-1 sudo[127531]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:51 compute-1 sshd-session[127251]: Connection closed by authenticating user root 152.42.136.110 port 34476 [preauth]
Jan 26 09:50:52 compute-1 sudo[127686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzpmwgzhmickyjphkdibvzurriyutbax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421051.7182846-775-162822586306652/AnsiballZ_command.py'
Jan 26 09:50:52 compute-1 sudo[127686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:52 compute-1 python3.9[127688]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:50:52 compute-1 sudo[127686]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:53.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:53 compute-1 sudo[127841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmmkoffrvfwuzeulsmhecgawbfhkzhqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421052.7063992-799-163039674727828/AnsiballZ_file.py'
Jan 26 09:50:53 compute-1 sudo[127841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:53 compute-1 python3.9[127843]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:50:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:53.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:53 compute-1 sudo[127841]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:53 compute-1 ceph-mon[80107]: pgmap v252: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:50:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095053 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:50:54 compute-1 python3.9[127994]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:50:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:50:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:55.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:50:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:50:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:55.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:50:55 compute-1 ceph-mon[80107]: pgmap v253: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 09:50:55 compute-1 sudo[128146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejcihacdduppvosnikzgseanojwvdedv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421055.5157802-919-166550595091683/AnsiballZ_command.py'
Jan 26 09:50:55 compute-1 sudo[128146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:56 compute-1 python3.9[128148]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:50:56 compute-1 ovs-vsctl[128149]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 26 09:50:56 compute-1 sudo[128146]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:50:56 compute-1 sudo[128174]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:50:56 compute-1 sudo[128174]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:50:56 compute-1 sudo[128174]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:56 compute-1 sudo[128324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfcslzejwavbmefnfypfphqocputaiak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421056.560572-946-158929608404979/AnsiballZ_command.py'
Jan 26 09:50:56 compute-1 sudo[128324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:57.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:57 compute-1 python3.9[128326]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ovs-vsctl show | grep -q "Manager"
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:50:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:57 compute-1 sudo[128324]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:50:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:57.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:50:57 compute-1 ceph-mon[80107]: pgmap v254: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:50:57 compute-1 sudo[128480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uszxfowwqljbeugjhflhbqnahephjhqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421057.4919236-970-197479987447658/AnsiballZ_command.py'
Jan 26 09:50:57 compute-1 sudo[128480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:58 compute-1 python3.9[128482]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:50:58 compute-1 ovs-vsctl[128483]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 26 09:50:58 compute-1 sudo[128480]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:58 compute-1 python3.9[128633]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:50:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:50:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:59.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:50:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:50:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:50:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000054s ======
Jan 26 09:50:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:59.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 26 09:50:59 compute-1 sudo[128786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wielsichqtpdnzyktpamlpgnmjhutzzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421059.1957152-1021-39497680367985/AnsiballZ_file.py'
Jan 26 09:50:59 compute-1 sudo[128786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:50:59 compute-1 python3.9[128788]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:50:59 compute-1 sudo[128786]: pam_unix(sudo:session): session closed for user root
Jan 26 09:50:59 compute-1 ceph-mon[80107]: pgmap v255: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:51:00 compute-1 sudo[128938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eazpsxfozsmiqjiroguhfbckluzwjovv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421059.9999895-1045-138927137392417/AnsiballZ_stat.py'
Jan 26 09:51:00 compute-1 sudo[128938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:00 compute-1 python3.9[128940]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:51:00 compute-1 sudo[128938]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:01 compute-1 sudo[129016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzsgabsoshcabknqdfmslawylldhlozr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421059.9999895-1045-138927137392417/AnsiballZ_file.py'
Jan 26 09:51:01 compute-1 sudo[129016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:01.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:51:01 compute-1 python3.9[129018]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:51:01 compute-1 sudo[129016]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:01.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:01 compute-1 sudo[129169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slftfeptnritjdqgfokeomgkkjwjrgms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421061.4171689-1045-269086909865119/AnsiballZ_stat.py'
Jan 26 09:51:01 compute-1 sudo[129169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:02 compute-1 ceph-mon[80107]: pgmap v256: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:51:02 compute-1 python3.9[129171]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:51:02 compute-1 sudo[129169]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:02 compute-1 sudo[129247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlyoddmxeqmiaevzettbxithraqdtxuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421061.4171689-1045-269086909865119/AnsiballZ_file.py'
Jan 26 09:51:02 compute-1 sudo[129247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:02 compute-1 python3.9[129249]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:51:02 compute-1 sudo[129247]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:03.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:51:03 compute-1 sudo[129399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-benlfhnvwjrdhzkyjoyhqffltbbbcakt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421062.8418546-1114-195845641449258/AnsiballZ_file.py'
Jan 26 09:51:03 compute-1 sudo[129399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:03.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:03 compute-1 python3.9[129401]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:51:03 compute-1 sudo[129399]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:04 compute-1 ceph-mon[80107]: pgmap v257: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:51:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:51:04 compute-1 sudo[129552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwqspkexntdeehrenzgtstvyewwvsnew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421063.7392185-1138-277103998112519/AnsiballZ_stat.py'
Jan 26 09:51:04 compute-1 sudo[129552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:04 compute-1 python3.9[129554]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:51:04 compute-1 sudo[129552]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:04 compute-1 sudo[129630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-recfybxkxyxxfpemrpkjdlywtywgqzfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421063.7392185-1138-277103998112519/AnsiballZ_file.py'
Jan 26 09:51:04 compute-1 sudo[129630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:04 compute-1 python3.9[129632]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:51:04 compute-1 sudo[129630]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:05.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:05.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:05 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 09:51:05 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Cumulative writes: 2458 writes, 14K keys, 2458 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.06 MB/s
                                           Cumulative WAL: 2458 writes, 2458 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2458 writes, 14K keys, 2458 commit groups, 1.0 writes per commit group, ingest: 37.70 MB, 0.06 MB/s
                                           Interval WAL: 2458 writes, 2458 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    122.2      0.17              0.06         6    0.028       0      0       0.0       0.0
                                             L6      1/0   12.28 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   3.0    145.5    128.5      0.49              0.14         5    0.098     21K   2276       0.0       0.0
                                            Sum      1/0   12.28 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   4.0    108.2    126.9      0.66              0.19        11    0.060     21K   2276       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   4.0    108.6    127.3      0.65              0.19        10    0.065     21K   2276       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   0.0    145.5    128.5      0.49              0.14         5    0.098     21K   2276       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    124.1      0.17              0.06         5    0.033       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.020, interval 0.020
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.08 GB write, 0.14 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.7 seconds
                                           Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.7 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55af2cb209b0#2 capacity: 304.00 MB usage: 2.15 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.00014 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(145,1.95 MB,0.639865%) FilterBlock(11,67.42 KB,0.0216584%) IndexBlock(11,139.77 KB,0.044898%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Jan 26 09:51:05 compute-1 sudo[129783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdmxujgmshoyswhjkuttopankcyoeqka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421065.246134-1174-268024423863485/AnsiballZ_stat.py'
Jan 26 09:51:05 compute-1 sudo[129783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:06 compute-1 python3.9[129785]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:51:06 compute-1 ceph-mon[80107]: pgmap v258: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 596 B/s wr, 1 op/s
Jan 26 09:51:06 compute-1 sudo[129783]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:06 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:51:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:06 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:51:06 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:51:06 compute-1 sudo[129861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mauxwrbksdbagmblorltouhlihcfjait ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421065.246134-1174-268024423863485/AnsiballZ_file.py'
Jan 26 09:51:06 compute-1 sudo[129861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:06 compute-1 python3.9[129863]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:51:06 compute-1 sudo[129861]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:07.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:07 compute-1 sudo[130013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcblpmuhrunpbchnazwprgirdqosoroy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421066.7448614-1210-19455599958181/AnsiballZ_systemd.py'
Jan 26 09:51:07 compute-1 sudo[130013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:07.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:07 compute-1 python3.9[130015]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:51:07 compute-1 systemd[1]: Reloading.
Jan 26 09:51:07 compute-1 systemd-rc-local-generator[130046]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:51:07 compute-1 systemd-sysv-generator[130050]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:51:07 compute-1 sudo[130013]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:08 compute-1 ceph-mon[80107]: pgmap v259: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 596 B/s wr, 1 op/s
Jan 26 09:51:08 compute-1 sudo[130204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjwfguhmqnhzmcauxmgjqcbhaktthclb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421068.1465282-1234-213534113986934/AnsiballZ_stat.py'
Jan 26 09:51:08 compute-1 sudo[130204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:08 compute-1 python3.9[130206]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:51:08 compute-1 sudo[130204]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:09 compute-1 sudo[130282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zaqoavtuddhqsdrhtisdiisjlrghfoiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421068.1465282-1234-213534113986934/AnsiballZ_file.py'
Jan 26 09:51:09 compute-1 sudo[130282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:09.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:51:09 compute-1 python3.9[130284]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:51:09 compute-1 sudo[130282]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:09.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:10 compute-1 sudo[130435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vljywxpbkgrxpeqllewynlulotrbovdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421069.4672964-1270-191311954067133/AnsiballZ_stat.py'
Jan 26 09:51:10 compute-1 sudo[130435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:10 compute-1 ceph-mon[80107]: pgmap v260: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 09:51:10 compute-1 python3.9[130437]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:51:10 compute-1 sudo[130435]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:10 compute-1 sudo[130513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itjczrewaulsvkqsthklqwqwbcezzshg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421069.4672964-1270-191311954067133/AnsiballZ_file.py'
Jan 26 09:51:10 compute-1 sudo[130513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:10 compute-1 python3.9[130515]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:51:10 compute-1 sudo[130513]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:11.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff4800a7e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:11 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:51:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:11.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:11 compute-1 sshd-session[130516]: Invalid user admin from 104.248.198.71 port 54006
Jan 26 09:51:11 compute-1 sudo[130670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-petodxelulvwsrlwlbwdtzmkilfbqsly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421071.069163-1306-225643430672316/AnsiballZ_systemd.py'
Jan 26 09:51:11 compute-1 sudo[130670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:11 compute-1 sshd-session[130516]: Connection closed by invalid user admin 104.248.198.71 port 54006 [preauth]
Jan 26 09:51:11 compute-1 python3.9[130672]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:51:11 compute-1 systemd[1]: Reloading.
Jan 26 09:51:11 compute-1 systemd-sysv-generator[130703]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:51:11 compute-1 systemd-rc-local-generator[130698]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:51:12 compute-1 ceph-mon[80107]: pgmap v261: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:51:12 compute-1 systemd[1]: Starting Create netns directory...
Jan 26 09:51:12 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 09:51:12 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 09:51:12 compute-1 systemd[1]: Finished Create netns directory.
Jan 26 09:51:12 compute-1 sudo[130670]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:13.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:13 compute-1 sudo[130866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnwnjtrtvagcghdypdkkjwuelfseggbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421072.7354984-1336-208410190824031/AnsiballZ_file.py'
Jan 26 09:51:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18000d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:13 compute-1 sudo[130866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:13 compute-1 python3.9[130868]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:51:13 compute-1 sudo[130866]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:13.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:14 compute-1 sudo[131019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywhbeghqedgnylsemcqoeqjxiykmsopd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421073.5682874-1360-4511416554229/AnsiballZ_stat.py'
Jan 26 09:51:14 compute-1 sudo[131019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:14 compute-1 ceph-mon[80107]: pgmap v262: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:51:14 compute-1 python3.9[131021]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:51:14 compute-1 sudo[131019]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:14 compute-1 sudo[131142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrvqsawmzhwedzfpxpboepuhysmcpfkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421073.5682874-1360-4511416554229/AnsiballZ_copy.py'
Jan 26 09:51:14 compute-1 sudo[131142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:14 compute-1 python3.9[131144]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421073.5682874-1360-4511416554229/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:51:14 compute-1 sudo[131142]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:15.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18001890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:15.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095115 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:51:15 compute-1 sudo[131295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxltgxrlmrjfvswolczxleklwqfzfljp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421075.5490956-1411-185793862040920/AnsiballZ_file.py'
Jan 26 09:51:15 compute-1 sudo[131295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:16 compute-1 python3.9[131297]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:51:16 compute-1 sudo[131295]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:16 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:51:16 compute-1 ceph-mon[80107]: pgmap v263: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:51:16 compute-1 sudo[131374]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:51:16 compute-1 sudo[131374]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:51:16 compute-1 sudo[131374]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:16 compute-1 sudo[131472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtqprumzxhhjidubiygnpgcyipscxtfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421076.367546-1435-94519737046694/AnsiballZ_file.py'
Jan 26 09:51:16 compute-1 sudo[131472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:16 compute-1 python3.9[131474]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:51:16 compute-1 sudo[131472]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:17.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18001890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:17.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:17 compute-1 sudo[131626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlxtpisguyxhycdnzphyyijcmviztsiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421077.1851642-1459-30485736147614/AnsiballZ_stat.py'
Jan 26 09:51:17 compute-1 sudo[131626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:17 compute-1 python3.9[131628]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:51:17 compute-1 sudo[131626]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:18 compute-1 sudo[131750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsqihzdfgxenvwdrwxfxwgewfoipwpib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421077.1851642-1459-30485736147614/AnsiballZ_copy.py'
Jan 26 09:51:18 compute-1 sudo[131750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:18 compute-1 python3.9[131752]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421077.1851642-1459-30485736147614/.source.json _original_basename=.lb_5vyec follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:51:18 compute-1 sudo[131750]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:18 compute-1 ceph-mon[80107]: pgmap v264: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:51:18 compute-1 sshd-session[131606]: Connection closed by authenticating user root 178.62.249.31 port 55884 [preauth]
Jan 26 09:51:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:19.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:19 compute-1 python3.9[131902]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:51:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:51:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:19.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:20 compute-1 ceph-mon[80107]: pgmap v265: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:51:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:21.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff180025a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:21 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:51:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:21.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:21 compute-1 sudo[132325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugippieftpoxdhbbzogvfmwzhycpgngy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421081.0037744-1579-188235372612159/AnsiballZ_container_config_data.py'
Jan 26 09:51:21 compute-1 sudo[132325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:21 compute-1 python3.9[132327]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 26 09:51:21 compute-1 sudo[132325]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:22 compute-1 ceph-mon[80107]: pgmap v266: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 426 B/s wr, 2 op/s
Jan 26 09:51:22 compute-1 sudo[132477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itvqbmvsuockpunsblpeahjujgxjpgyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421082.177843-1612-190029764445563/AnsiballZ_container_config_hash.py'
Jan 26 09:51:22 compute-1 sudo[132477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:22 compute-1 python3.9[132479]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 09:51:22 compute-1 sudo[132477]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:23.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:23.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:23 compute-1 sudo[132631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjtsanxorowgnfqmbhllxwiorlpxifxv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769421083.3859823-1642-82133937901786/AnsiballZ_edpm_container_manage.py'
Jan 26 09:51:23 compute-1 sudo[132631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:24 compute-1 python3[132633]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 09:51:24 compute-1 ceph-mon[80107]: pgmap v267: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:51:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:25.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10002160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:25.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:26 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:51:26 compute-1 ceph-mon[80107]: pgmap v268: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:51:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:27.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:27.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:27 compute-1 ceph-mon[80107]: pgmap v269: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:51:27 compute-1 sshd-session[130673]: Connection closed by 167.94.138.40 port 61342 [preauth]
Jan 26 09:51:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:29.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:29 compute-1 podman[132646]: 2026-01-26 09:51:29.311208781 +0000 UTC m=+4.919707012 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 26 09:51:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:29.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:29 compute-1 podman[132766]: 2026-01-26 09:51:29.543831579 +0000 UTC m=+0.071855592 container create 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:51:29 compute-1 podman[132766]: 2026-01-26 09:51:29.514040286 +0000 UTC m=+0.042064289 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 26 09:51:29 compute-1 python3[132633]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 26 09:51:29 compute-1 sudo[132631]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:29 compute-1 ceph-mon[80107]: pgmap v270: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:51:30 compute-1 sudo[132955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blxnkefxtugcsuuibazqgjskcxbuzbeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421090.1083949-1666-162702678017643/AnsiballZ_stat.py'
Jan 26 09:51:30 compute-1 sudo[132955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:30 compute-1 python3.9[132957]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:51:30 compute-1 sudo[132955]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:30 compute-1 sudo[132960]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:51:30 compute-1 sudo[132960]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:51:30 compute-1 sudo[132960]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:30 compute-1 sudo[133009]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 09:51:30 compute-1 sudo[133009]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:51:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:31.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:31 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:51:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:31.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:31 compute-1 sudo[133009]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:31 compute-1 sudo[133192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irhoftcmeibkajsrnkrhzhdksbgrsnwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421091.2422845-1693-120970611997016/AnsiballZ_file.py'
Jan 26 09:51:31 compute-1 sudo[133192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:31 compute-1 python3.9[133194]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:51:31 compute-1 sudo[133192]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:32 compute-1 ceph-mon[80107]: pgmap v271: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 09:51:32 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:51:32 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 09:51:32 compute-1 sudo[133268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjlxazhzqenfvriwmrnucghtfgifyhwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421091.2422845-1693-120970611997016/AnsiballZ_stat.py'
Jan 26 09:51:32 compute-1 sudo[133268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:32 compute-1 python3.9[133270]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:51:32 compute-1 sudo[133268]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:33.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:33 compute-1 sudo[133419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khmxmpoimqocovwokmaliikwsngvkxzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421092.5742908-1693-176886658685265/AnsiballZ_copy.py'
Jan 26 09:51:33 compute-1 sudo[133419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:33.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:51:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:51:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 09:51:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 09:51:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:51:33 compute-1 python3.9[133421]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769421092.5742908-1693-176886658685265/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:51:33 compute-1 sudo[133419]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:34 compute-1 sudo[133497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hojgovwcmmpacqpanjvigfsrhijxkski ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421092.5742908-1693-176886658685265/AnsiballZ_systemd.py'
Jan 26 09:51:34 compute-1 sudo[133497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:34 compute-1 python3.9[133499]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 09:51:34 compute-1 systemd[1]: Reloading.
Jan 26 09:51:34 compute-1 systemd-rc-local-generator[133528]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:51:34 compute-1 systemd-sysv-generator[133531]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:51:34 compute-1 ceph-mon[80107]: pgmap v272: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:51:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:51:34 compute-1 sudo[133497]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:35.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:35 compute-1 sudo[133610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqtvlvzugxqkgujveqhrraatekpkkppb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421092.5742908-1693-176886658685265/AnsiballZ_systemd.py'
Jan 26 09:51:35 compute-1 sudo[133610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:35 compute-1 sshd-session[133470]: Connection closed by authenticating user root 152.42.136.110 port 40000 [preauth]
Jan 26 09:51:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:35.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:35 compute-1 python3.9[133612]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:51:35 compute-1 systemd[1]: Reloading.
Jan 26 09:51:35 compute-1 systemd-rc-local-generator[133644]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:51:35 compute-1 systemd-sysv-generator[133647]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:51:36 compute-1 systemd[1]: Starting ovn_controller container...
Jan 26 09:51:36 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:51:36 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c8bf517f7993a3b8facf5a0730bef250c97d1a838382ce53c490ea79845b577/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 26 09:51:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:51:36 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496.
Jan 26 09:51:36 compute-1 podman[133654]: 2026-01-26 09:51:36.255855719 +0000 UTC m=+0.173310740 container init 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 26 09:51:36 compute-1 ovn_controller[133670]: + sudo -E kolla_set_configs
Jan 26 09:51:36 compute-1 podman[133654]: 2026-01-26 09:51:36.293538747 +0000 UTC m=+0.210993748 container start 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 09:51:36 compute-1 edpm-start-podman-container[133654]: ovn_controller
Jan 26 09:51:36 compute-1 systemd[1]: Created slice User Slice of UID 0.
Jan 26 09:51:36 compute-1 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 26 09:51:36 compute-1 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 26 09:51:36 compute-1 systemd[1]: Starting User Manager for UID 0...
Jan 26 09:51:36 compute-1 systemd[133709]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Jan 26 09:51:36 compute-1 edpm-start-podman-container[133653]: Creating additional drop-in dependency for "ovn_controller" (34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496)
Jan 26 09:51:36 compute-1 podman[133677]: 2026-01-26 09:51:36.390585036 +0000 UTC m=+0.083410708 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 26 09:51:36 compute-1 systemd[1]: 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496-88ead25080fd3b5.service: Main process exited, code=exited, status=1/FAILURE
Jan 26 09:51:36 compute-1 systemd[1]: 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496-88ead25080fd3b5.service: Failed with result 'exit-code'.
Jan 26 09:51:36 compute-1 systemd[1]: Reloading.
Jan 26 09:51:36 compute-1 systemd-rc-local-generator[133754]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:51:36 compute-1 systemd-sysv-generator[133760]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:51:36 compute-1 systemd[133709]: Queued start job for default target Main User Target.
Jan 26 09:51:36 compute-1 systemd[133709]: Created slice User Application Slice.
Jan 26 09:51:36 compute-1 systemd[133709]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 26 09:51:36 compute-1 systemd[133709]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 09:51:36 compute-1 systemd[133709]: Reached target Paths.
Jan 26 09:51:36 compute-1 systemd[133709]: Reached target Timers.
Jan 26 09:51:36 compute-1 systemd[133709]: Starting D-Bus User Message Bus Socket...
Jan 26 09:51:36 compute-1 systemd[133709]: Starting Create User's Volatile Files and Directories...
Jan 26 09:51:36 compute-1 systemd[133709]: Finished Create User's Volatile Files and Directories.
Jan 26 09:51:36 compute-1 systemd[133709]: Listening on D-Bus User Message Bus Socket.
Jan 26 09:51:36 compute-1 systemd[133709]: Reached target Sockets.
Jan 26 09:51:36 compute-1 systemd[133709]: Reached target Basic System.
Jan 26 09:51:36 compute-1 systemd[133709]: Reached target Main User Target.
Jan 26 09:51:36 compute-1 systemd[133709]: Startup finished in 170ms.
Jan 26 09:51:36 compute-1 ceph-mon[80107]: pgmap v273: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 09:51:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095136 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:51:36 compute-1 systemd[1]: Started User Manager for UID 0.
Jan 26 09:51:36 compute-1 systemd[1]: Started ovn_controller container.
Jan 26 09:51:36 compute-1 sudo[133765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:51:36 compute-1 systemd[1]: Started Session c1 of User root.
Jan 26 09:51:36 compute-1 sudo[133610]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:36 compute-1 sudo[133765]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:51:36 compute-1 sudo[133765]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:36 compute-1 ovn_controller[133670]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 09:51:36 compute-1 ovn_controller[133670]: INFO:__main__:Validating config file
Jan 26 09:51:36 compute-1 ovn_controller[133670]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 09:51:36 compute-1 ovn_controller[133670]: INFO:__main__:Writing out command to execute
Jan 26 09:51:36 compute-1 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 26 09:51:36 compute-1 ovn_controller[133670]: ++ cat /run_command
Jan 26 09:51:36 compute-1 ovn_controller[133670]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 26 09:51:36 compute-1 ovn_controller[133670]: + ARGS=
Jan 26 09:51:36 compute-1 ovn_controller[133670]: + sudo kolla_copy_cacerts
Jan 26 09:51:36 compute-1 systemd[1]: Started Session c2 of User root.
Jan 26 09:51:36 compute-1 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 26 09:51:36 compute-1 ovn_controller[133670]: + [[ ! -n '' ]]
Jan 26 09:51:36 compute-1 ovn_controller[133670]: + . kolla_extend_start
Jan 26 09:51:36 compute-1 ovn_controller[133670]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 26 09:51:36 compute-1 ovn_controller[133670]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 26 09:51:36 compute-1 ovn_controller[133670]: + umask 0022
Jan 26 09:51:36 compute-1 ovn_controller[133670]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 26 09:51:36 compute-1 NetworkManager[49073]: <info>  [1769421096.9151] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Jan 26 09:51:36 compute-1 NetworkManager[49073]: <info>  [1769421096.9163] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 09:51:36 compute-1 NetworkManager[49073]: <warn>  [1769421096.9169] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 09:51:36 compute-1 NetworkManager[49073]: <info>  [1769421096.9183] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 26 09:51:36 compute-1 NetworkManager[49073]: <info>  [1769421096.9193] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Jan 26 09:51:36 compute-1 NetworkManager[49073]: <info>  [1769421096.9203] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 26 09:51:36 compute-1 kernel: br-int: entered promiscuous mode
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00022|main|INFO|OVS feature set changed, force recompute.
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 09:51:36 compute-1 ovn_controller[133670]: 2026-01-26T09:51:36Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 09:51:36 compute-1 NetworkManager[49073]: <info>  [1769421096.9478] manager: (ovn-80993c-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 26 09:51:36 compute-1 NetworkManager[49073]: <info>  [1769421096.9488] manager: (ovn-f90cdf-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 26 09:51:36 compute-1 kernel: genev_sys_6081: entered promiscuous mode
Jan 26 09:51:36 compute-1 NetworkManager[49073]: <info>  [1769421096.9718] device (genev_sys_6081): carrier: link connected
Jan 26 09:51:36 compute-1 NetworkManager[49073]: <info>  [1769421096.9723] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/21)
Jan 26 09:51:36 compute-1 systemd-udevd[133828]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 09:51:36 compute-1 systemd-udevd[133829]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 09:51:37 compute-1 NetworkManager[49073]: <info>  [1769421097.0385] manager: (ovn-8128a1-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Jan 26 09:51:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:37.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:37.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:37 compute-1 ceph-mon[80107]: pgmap v274: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:51:37 compute-1 python3.9[133960]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 26 09:51:38 compute-1 sudo[134127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waytzlowvqtzlddwlwisotnfizgomqfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421098.4445858-1828-142697491935149/AnsiballZ_stat.py'
Jan 26 09:51:38 compute-1 sudo[134127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:51:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:39.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:51:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.003000081s ======
Jan 26 09:51:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:39.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000081s
Jan 26 09:51:39 compute-1 python3.9[134129]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:51:39 compute-1 sudo[134131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 09:51:39 compute-1 sudo[134131]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:51:39 compute-1 sudo[134131]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:39 compute-1 sudo[134127]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:40 compute-1 ceph-mon[80107]: pgmap v275: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:51:40 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:51:40 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:51:40 compute-1 sudo[134276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjljkgeggmlzsqcpoifycikrfgfgtott ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421098.4445858-1828-142697491935149/AnsiballZ_copy.py'
Jan 26 09:51:40 compute-1 sudo[134276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:40 compute-1 python3.9[134278]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421098.4445858-1828-142697491935149/.source.yaml _original_basename=.918c_9n7 follow=False checksum=da37636c5844ad86706b9cbdcceae3b87fc97017 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:51:40 compute-1 sudo[134276]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:41 compute-1 sudo[134428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhzojicqdshzxtpiwxnrpvfaotongtkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421100.6656168-1873-33021668879734/AnsiballZ_command.py'
Jan 26 09:51:41 compute-1 sudo[134428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:41.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:41 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:51:41 compute-1 python3.9[134430]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:51:41 compute-1 ovs-vsctl[134431]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 26 09:51:41 compute-1 sudo[134428]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:41.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095141 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:51:42 compute-1 ceph-mon[80107]: pgmap v276: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 09:51:42 compute-1 sudo[134582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnoldyohwzpgrvacabojbkbatdkcxrtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421101.7368667-1897-55538156487944/AnsiballZ_command.py'
Jan 26 09:51:42 compute-1 sudo[134582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:42 compute-1 python3.9[134584]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:51:42 compute-1 ovs-vsctl[134586]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 26 09:51:42 compute-1 sudo[134582]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:43.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:43.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:43 compute-1 sudo[134738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivwixfydjweviqbvqgwpgzxiabeiovmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421103.3858995-1939-205744810876304/AnsiballZ_command.py'
Jan 26 09:51:43 compute-1 sudo[134738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:44 compute-1 python3.9[134740]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:51:44 compute-1 ovs-vsctl[134741]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 26 09:51:44 compute-1 sudo[134738]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:44 compute-1 ceph-mon[80107]: pgmap v277: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:51:44 compute-1 sshd-session[122585]: Connection closed by 192.168.122.30 port 40358
Jan 26 09:51:44 compute-1 sshd-session[122582]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:51:44 compute-1 systemd[1]: session-49.scope: Deactivated successfully.
Jan 26 09:51:44 compute-1 systemd[1]: session-49.scope: Consumed 1min 8.360s CPU time.
Jan 26 09:51:44 compute-1 systemd-logind[783]: Session 49 logged out. Waiting for processes to exit.
Jan 26 09:51:44 compute-1 systemd-logind[783]: Removed session 49.
Jan 26 09:51:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:45.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:45.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:51:46 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:51:46 compute-1 ceph-mon[80107]: pgmap v278: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:51:46 compute-1 systemd[1]: Stopping User Manager for UID 0...
Jan 26 09:51:46 compute-1 systemd[133709]: Activating special unit Exit the Session...
Jan 26 09:51:46 compute-1 systemd[133709]: Stopped target Main User Target.
Jan 26 09:51:46 compute-1 systemd[133709]: Stopped target Basic System.
Jan 26 09:51:46 compute-1 systemd[133709]: Stopped target Paths.
Jan 26 09:51:46 compute-1 systemd[133709]: Stopped target Sockets.
Jan 26 09:51:46 compute-1 systemd[133709]: Stopped target Timers.
Jan 26 09:51:46 compute-1 systemd[133709]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 26 09:51:46 compute-1 systemd[133709]: Closed D-Bus User Message Bus Socket.
Jan 26 09:51:46 compute-1 systemd[133709]: Stopped Create User's Volatile Files and Directories.
Jan 26 09:51:46 compute-1 systemd[133709]: Removed slice User Application Slice.
Jan 26 09:51:46 compute-1 systemd[133709]: Reached target Shutdown.
Jan 26 09:51:46 compute-1 systemd[133709]: Finished Exit the Session.
Jan 26 09:51:46 compute-1 systemd[133709]: Reached target Exit the Session.
Jan 26 09:51:46 compute-1 systemd[1]: user@0.service: Deactivated successfully.
Jan 26 09:51:46 compute-1 systemd[1]: Stopped User Manager for UID 0.
Jan 26 09:51:46 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 26 09:51:47 compute-1 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 26 09:51:47 compute-1 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 26 09:51:47 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 26 09:51:47 compute-1 systemd[1]: Removed slice User Slice of UID 0.
Jan 26 09:51:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:47.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:47.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:47 compute-1 ceph-mon[80107]: pgmap v279: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:51:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:51:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:48 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:51:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:48 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:51:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:51:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:49.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:51:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:49.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:49 compute-1 sshd-session[134771]: Accepted publickey for zuul from 192.168.122.30 port 39096 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:51:49 compute-1 systemd-logind[783]: New session 51 of user zuul.
Jan 26 09:51:49 compute-1 systemd[1]: Started Session 51 of User zuul.
Jan 26 09:51:49 compute-1 sshd-session[134771]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:51:49 compute-1 ceph-mon[80107]: pgmap v280: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 425 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:51:50 compute-1 python3.9[134924]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:51:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:51.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:51:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:51.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:51 compute-1 ceph-mon[80107]: pgmap v281: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 766 B/s wr, 2 op/s
Jan 26 09:51:51 compute-1 sudo[135079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gogugnfzdwrdnuxhtgxaxzkqmeadhygy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421111.448024-58-98606864693679/AnsiballZ_file.py'
Jan 26 09:51:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:51:51 compute-1 sudo[135079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:52 compute-1 python3.9[135081]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:51:52 compute-1 sudo[135079]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:52 compute-1 sudo[135231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrormpyhzsejsfkqkzyyrcofirfhkgma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421112.3990567-58-183366567935269/AnsiballZ_file.py'
Jan 26 09:51:52 compute-1 sudo[135231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:52 compute-1 python3.9[135233]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:51:53 compute-1 sudo[135231]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:53.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:53.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:53 compute-1 sudo[135384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twoasljirjcrbnkbdigwbvpwqovonlkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421113.194464-58-250846301835158/AnsiballZ_file.py'
Jan 26 09:51:53 compute-1 sudo[135384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:53 compute-1 python3.9[135386]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:51:53 compute-1 sudo[135384]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:53 compute-1 ceph-mon[80107]: pgmap v282: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 766 B/s wr, 2 op/s
Jan 26 09:51:54 compute-1 sudo[135536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sehqrguwwzvtswoikbiewhzugfmudgis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421114.0089111-58-2621232351285/AnsiballZ_file.py'
Jan 26 09:51:54 compute-1 sudo[135536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:54 compute-1 python3.9[135538]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:51:54 compute-1 sudo[135536]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:51:55 compute-1 sudo[135689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctbiczqtyeteezjhhbaafdieljwccfhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421114.7666037-58-115629866745042/AnsiballZ_file.py'
Jan 26 09:51:55 compute-1 sudo[135689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:55.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:55 compute-1 python3.9[135691]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:51:55 compute-1 sudo[135689]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:51:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:55.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:51:56 compute-1 ceph-mon[80107]: pgmap v283: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1022 B/s wr, 3 op/s
Jan 26 09:51:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:51:56 compute-1 python3.9[135844]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:51:56 compute-1 sudo[135950]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:51:56 compute-1 sudo[135950]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:51:56 compute-1 sudo[135950]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:57 compute-1 sudo[136019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vawurgkasmxskkrtosrbqklqlpxlknqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421116.5070193-190-33212049511214/AnsiballZ_seboolean.py'
Jan 26 09:51:57 compute-1 sudo[136019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:51:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:57.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:57 compute-1 python3.9[136021]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 26 09:51:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:51:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:57.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:51:57 compute-1 sudo[136019]: pam_unix(sudo:session): session closed for user root
Jan 26 09:51:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:58 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:51:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:58 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:51:58 compute-1 ceph-mon[80107]: pgmap v284: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 936 B/s wr, 2 op/s
Jan 26 09:51:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095158 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:51:58 compute-1 python3.9[136172]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:51:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:51:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:59.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:51:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:51:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:51:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:51:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:59.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:51:59 compute-1 python3.9[136294]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421118.1652422-214-224095567021030/.source follow=False _original_basename=haproxy.j2 checksum=1daf285be4abb25cbd7ba376734de140aac9aefe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:52:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:00 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:52:00 compute-1 ceph-mon[80107]: pgmap v285: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 936 B/s wr, 2 op/s
Jan 26 09:52:00 compute-1 python3.9[136446]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:52:01 compute-1 sshd-session[136418]: Invalid user admin from 104.248.198.71 port 39878
Jan 26 09:52:01 compute-1 python3.9[136567]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421119.9583406-259-86040440830638/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:52:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:01.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:01 compute-1 sshd-session[136418]: Connection closed by invalid user admin 104.248.198.71 port 39878 [preauth]
Jan 26 09:52:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:52:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:52:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:01.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:52:01 compute-1 ceph-mon[80107]: pgmap v286: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.7 KiB/s rd, 1.7 KiB/s wr, 5 op/s
Jan 26 09:52:01 compute-1 sudo[136718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsoxjdztljvgshdgktqvvztoucpekypu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421121.5656092-310-252330869032612/AnsiballZ_setup.py'
Jan 26 09:52:01 compute-1 sudo[136718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:02 compute-1 python3.9[136720]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 09:52:02 compute-1 sudo[136718]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:02 compute-1 sudo[136802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sozpvubnuufnhaqvaxcvyfkfixlkiitz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421121.5656092-310-252330869032612/AnsiballZ_dnf.py'
Jan 26 09:52:02 compute-1 sudo[136802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:03 compute-1 python3.9[136804]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:52:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:52:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:03.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:52:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:52:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:03.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:52:03 compute-1 ceph-mon[80107]: pgmap v287: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:52:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:52:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095203 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:52:04 compute-1 sudo[136802]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:52:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:05.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:52:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:05 compute-1 sudo[136958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzgmujlewcdhqhjrcjqboatpcsidorqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421124.6599848-346-51566425515421/AnsiballZ_systemd.py'
Jan 26 09:52:05 compute-1 sudo[136958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:52:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:05.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:52:05 compute-1 python3.9[136960]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 09:52:05 compute-1 sshd-session[136807]: Connection closed by authenticating user root 178.62.249.31 port 34486 [preauth]
Jan 26 09:52:05 compute-1 sudo[136958]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:05 compute-1 ceph-mon[80107]: pgmap v288: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:52:06 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:52:06 compute-1 python3.9[137114]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:52:06 compute-1 ovn_controller[133670]: 2026-01-26T09:52:06Z|00025|memory|INFO|16128 kB peak resident set size after 30.0 seconds
Jan 26 09:52:06 compute-1 ovn_controller[133670]: 2026-01-26T09:52:06Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Jan 26 09:52:06 compute-1 podman[137209]: 2026-01-26 09:52:06.963924556 +0000 UTC m=+0.101715683 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:52:07 compute-1 python3.9[137247]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421125.9850605-370-128080199012704/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:52:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:52:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:07.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:52:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:07.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:07 compute-1 python3.9[137412]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:52:07 compute-1 ceph-mon[80107]: pgmap v289: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 767 B/s wr, 2 op/s
Jan 26 09:52:08 compute-1 python3.9[137533]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421127.3099568-370-245166052308011/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:52:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:09.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:09.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:09 compute-1 python3.9[137684]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:52:09 compute-1 ceph-mon[80107]: pgmap v290: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 767 B/s wr, 2 op/s
Jan 26 09:52:10 compute-1 python3.9[137805]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421129.300026-502-19400771539884/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:52:11 compute-1 python3.9[137955]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:52:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:52:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:11.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:52:11 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:52:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:11.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:11 compute-1 python3.9[138077]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421130.6141384-502-243027666593439/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:52:12 compute-1 ceph-mon[80107]: pgmap v291: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 767 B/s wr, 2 op/s
Jan 26 09:52:12 compute-1 python3.9[138227]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:52:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:13.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:52:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:13.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:52:13 compute-1 sudo[138379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umytgstbgnyrsrfbquwkxlbcvuyhqjti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421133.1473117-616-174702285645730/AnsiballZ_file.py'
Jan 26 09:52:13 compute-1 sudo[138379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:13 compute-1 python3.9[138382]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:52:13 compute-1 sudo[138379]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:14 compute-1 sudo[138532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyipncyyjcrhwsylhwpqptnuifbqxwel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421133.9635937-640-151120376633649/AnsiballZ_stat.py'
Jan 26 09:52:14 compute-1 ceph-mon[80107]: pgmap v292: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:52:14 compute-1 sudo[138532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:14 compute-1 python3.9[138534]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:52:14 compute-1 sudo[138532]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:14 compute-1 sudo[138610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvaoocjuzyawmdqrvjomxndhcpdhelbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421133.9635937-640-151120376633649/AnsiballZ_file.py'
Jan 26 09:52:14 compute-1 sudo[138610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:15 compute-1 python3.9[138612]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:52:15 compute-1 sudo[138610]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:15.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004020 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:52:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:15.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:52:15 compute-1 sudo[138763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgyogxyvayxjgainriqzgpmdwiyjqbha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421135.2517653-640-204871364755567/AnsiballZ_stat.py'
Jan 26 09:52:15 compute-1 sudo[138763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:15 compute-1 python3.9[138765]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:52:15 compute-1 sudo[138763]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:16 compute-1 sudo[138841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftgzzbewzsiljitangtntlhizxajmmhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421135.2517653-640-204871364755567/AnsiballZ_file.py'
Jan 26 09:52:16 compute-1 sudo[138841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:16 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:52:16 compute-1 python3.9[138843]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:52:16 compute-1 sudo[138841]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:16 compute-1 ceph-mon[80107]: pgmap v293: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:52:17 compute-1 sudo[139014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cetkijsbxtkcazvsmjmrhtqazzjeelbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421136.6440818-709-21812300039018/AnsiballZ_file.py'
Jan 26 09:52:17 compute-1 sudo[138977]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:52:17 compute-1 sudo[139014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:17 compute-1 sudo[138977]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:52:17 compute-1 sudo[138977]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:52:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:17.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:52:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:17 compute-1 python3.9[139019]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:52:17 compute-1 sudo[139014]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:17.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:17 compute-1 sudo[139173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byopqksscpqqfkrsyyfvvexislezbxwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421137.4750834-733-31584058731652/AnsiballZ_stat.py'
Jan 26 09:52:17 compute-1 sudo[139173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:18 compute-1 python3.9[139175]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:52:18 compute-1 sudo[139173]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:18 compute-1 sudo[139251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iykwptxtbegyhpjubaehhhkfpsfbrsws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421137.4750834-733-31584058731652/AnsiballZ_file.py'
Jan 26 09:52:18 compute-1 sudo[139251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:18 compute-1 ceph-mon[80107]: pgmap v294: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:52:18 compute-1 sshd-session[139021]: Connection closed by authenticating user root 152.42.136.110 port 59916 [preauth]
Jan 26 09:52:18 compute-1 python3.9[139253]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:52:18 compute-1 sudo[139251]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:19 compute-1 sudo[139403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uetrwheaofbsctjtjebtszkgmcqxbjnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421138.8018408-769-267247401946502/AnsiballZ_stat.py'
Jan 26 09:52:19 compute-1 sudo[139403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:52:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:19.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:52:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:19 compute-1 python3.9[139405]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:52:19 compute-1 sudo[139403]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:52:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:19.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:19 compute-1 sudo[139482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riimqdwralsnksidfxzqrpxcmjapepuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421138.8018408-769-267247401946502/AnsiballZ_file.py'
Jan 26 09:52:19 compute-1 sudo[139482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:19 compute-1 python3.9[139484]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:52:19 compute-1 sudo[139482]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:20 compute-1 sudo[139634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avjjgtcvlktmhrewsjugdtjdzsaqrlfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421140.0399024-805-265021110689403/AnsiballZ_systemd.py'
Jan 26 09:52:20 compute-1 sudo[139634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:20 compute-1 ceph-mon[80107]: pgmap v295: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:52:20 compute-1 python3.9[139636]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:52:20 compute-1 systemd[1]: Reloading.
Jan 26 09:52:20 compute-1 systemd-rc-local-generator[139662]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:52:20 compute-1 systemd-sysv-generator[139666]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:52:21 compute-1 sudo[139634]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:21.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:21 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:52:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:52:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:21.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:52:21 compute-1 ceph-mon[80107]: pgmap v296: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 39 KiB/s rd, 0 B/s wr, 64 op/s
Jan 26 09:52:21 compute-1 sudo[139824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvbpacnhwgcsgtjxhnlkijxocukalbjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421141.3882558-829-90557967328292/AnsiballZ_stat.py'
Jan 26 09:52:21 compute-1 sudo[139824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:21 compute-1 python3.9[139826]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:52:21 compute-1 sudo[139824]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:22 compute-1 sudo[139902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogsmgtvgwfuvaexagiqyjskuxgzeemim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421141.3882558-829-90557967328292/AnsiballZ_file.py'
Jan 26 09:52:22 compute-1 sudo[139902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:22 compute-1 python3.9[139904]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:52:22 compute-1 sudo[139902]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:22 compute-1 sudo[140054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfyptpsucpwpmovdykvnodssvwnzseht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421142.6452587-865-258298411898963/AnsiballZ_stat.py'
Jan 26 09:52:23 compute-1 sudo[140054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:23.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:23 compute-1 python3.9[140056]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:52:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:23 compute-1 sudo[140054]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:23.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:23 compute-1 sudo[140133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvnycudleseoutkaiufheidphhynccvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421142.6452587-865-258298411898963/AnsiballZ_file.py'
Jan 26 09:52:23 compute-1 sudo[140133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:23 compute-1 python3.9[140135]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:52:23 compute-1 sudo[140133]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:23 compute-1 ceph-mon[80107]: pgmap v297: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 39 KiB/s rd, 0 B/s wr, 64 op/s
Jan 26 09:52:24 compute-1 sudo[140285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nigfnviunydbziidjlqhypelraqzcpmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421143.902722-901-112605318378227/AnsiballZ_systemd.py'
Jan 26 09:52:24 compute-1 sudo[140285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:24 compute-1 python3.9[140287]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:52:24 compute-1 systemd[1]: Reloading.
Jan 26 09:52:24 compute-1 systemd-sysv-generator[140318]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:52:24 compute-1 systemd-rc-local-generator[140315]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:52:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:25.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:25.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:25 compute-1 ceph-mon[80107]: pgmap v298: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 59 KiB/s rd, 0 B/s wr, 97 op/s
Jan 26 09:52:25 compute-1 systemd[1]: Starting Create netns directory...
Jan 26 09:52:25 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 09:52:25 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 09:52:25 compute-1 systemd[1]: Finished Create netns directory.
Jan 26 09:52:25 compute-1 sudo[140285]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:26 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:52:26 compute-1 sudo[140480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxicetsvgyhopzbfembymkqmrisgxwlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421146.2385933-931-86406990435106/AnsiballZ_file.py'
Jan 26 09:52:26 compute-1 sudo[140480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:26 compute-1 python3.9[140482]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:52:26 compute-1 sudo[140480]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:52:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:27.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:52:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:27 compute-1 sudo[140633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twrpeljzsucstnlpfvqgqcfosqeqngzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421147.0174866-955-33889155433279/AnsiballZ_stat.py'
Jan 26 09:52:27 compute-1 sudo[140633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:27.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:27 compute-1 python3.9[140635]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:52:27 compute-1 sudo[140633]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:27 compute-1 ceph-mon[80107]: pgmap v299: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 59 KiB/s rd, 0 B/s wr, 97 op/s
Jan 26 09:52:28 compute-1 sudo[140757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuqolgpzbkscaqcgqjlmawietpmdelnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421147.0174866-955-33889155433279/AnsiballZ_copy.py'
Jan 26 09:52:28 compute-1 sudo[140757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:28 compute-1 python3.9[140759]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421147.0174866-955-33889155433279/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:52:28 compute-1 sudo[140757]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:29 compute-1 sudo[140909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lagesmzpimczfokuspwbpkojydqzivln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421148.7831545-1006-258941261470970/AnsiballZ_file.py'
Jan 26 09:52:29 compute-1 sudo[140909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:52:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:29.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:52:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400040a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:29 compute-1 python3.9[140911]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:52:29 compute-1 sudo[140909]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:29.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:29 compute-1 ceph-mon[80107]: pgmap v300: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 59 KiB/s rd, 0 B/s wr, 97 op/s
Jan 26 09:52:29 compute-1 sudo[141062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmgqfiilrwezqqbmmhecxunypnpguuja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421149.5692039-1030-26948137303499/AnsiballZ_file.py'
Jan 26 09:52:29 compute-1 sudo[141062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:30 compute-1 python3.9[141064]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:52:30 compute-1 sudo[141062]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:30 compute-1 sudo[141214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvgdxmtazgoufqmemlvaznqjwjkzxfzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421150.4518435-1054-13784981376938/AnsiballZ_stat.py'
Jan 26 09:52:30 compute-1 sudo[141214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:30 compute-1 python3.9[141216]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:52:30 compute-1 sudo[141214]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:31.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:31 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:52:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:31 compute-1 sudo[141337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncvseoaoyiwvtakdvbjqexarlsgffwzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421150.4518435-1054-13784981376938/AnsiballZ_copy.py'
Jan 26 09:52:31 compute-1 sudo[141337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:31.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:31 compute-1 python3.9[141339]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421150.4518435-1054-13784981376938/.source.json _original_basename=.mhwyc9ru follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:52:31 compute-1 sudo[141337]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:31 compute-1 ceph-mon[80107]: pgmap v301: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 59 KiB/s rd, 0 B/s wr, 97 op/s
Jan 26 09:52:32 compute-1 python3.9[141490]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:52:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400040c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18000f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:52:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:33.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:52:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:33.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:33 compute-1 ceph-mon[80107]: pgmap v302: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 0 B/s wr, 33 op/s
Jan 26 09:52:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:52:34 compute-1 sudo[141912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwicfqdbgnjeponnmofwlmtvhpshpnjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421154.3702097-1174-182951157027330/AnsiballZ_container_config_data.py'
Jan 26 09:52:34 compute-1 sudo[141912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:35 compute-1 python3.9[141914]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 26 09:52:35 compute-1 sudo[141912]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400040e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:35.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:35.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:35 compute-1 sudo[142065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzfjffbajfdpxkpbkekvhhvykvnawqoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421155.5064337-1207-182222267213056/AnsiballZ_container_config_hash.py'
Jan 26 09:52:35 compute-1 sudo[142065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:35 compute-1 ceph-mon[80107]: pgmap v303: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 0 B/s wr, 33 op/s
Jan 26 09:52:36 compute-1 python3.9[142067]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 09:52:36 compute-1 sudo[142065]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:52:37 compute-1 sudo[142190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:52:37 compute-1 sudo[142190]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:52:37 compute-1 sudo[142190]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:37 compute-1 sudo[142250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtbbhtvwopgkctrogbgxqnngryplqohf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769421156.647118-1237-243569583960595/AnsiballZ_edpm_container_manage.py'
Jan 26 09:52:37 compute-1 sudo[142250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:37 compute-1 podman[142209]: 2026-01-26 09:52:37.206785298 +0000 UTC m=+0.096640342 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 26 09:52:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:37.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004100 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:37 compute-1 python3[142259]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 09:52:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:37.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:38 compute-1 ceph-mon[80107]: pgmap v304: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:52:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:52:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:39.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:52:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:39.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:39 compute-1 sudo[142335]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:52:39 compute-1 sudo[142335]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:52:39 compute-1 sudo[142335]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:40 compute-1 sudo[142360]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 09:52:40 compute-1 ceph-mon[80107]: pgmap v305: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:52:40 compute-1 sudo[142360]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:52:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095240 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 2ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:52:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004120 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:41 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:52:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:52:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:41.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:52:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:52:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:41.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:52:42 compute-1 ceph-mon[80107]: pgmap v306: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:52:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:52:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:43.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:52:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:43.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:44 compute-1 ceph-mon[80107]: pgmap v307: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:52:44 compute-1 sudo[142360]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:45.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:45.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:45 compute-1 ceph-mon[80107]: pgmap v308: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 09:52:45 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:52:45 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 09:52:45 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:52:45 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:52:45 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 09:52:45 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 09:52:45 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:52:45 compute-1 podman[142284]: 2026-01-26 09:52:45.885891018 +0000 UTC m=+8.346917562 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 26 09:52:46 compute-1 podman[142499]: 2026-01-26 09:52:46.050877847 +0000 UTC m=+0.061087709 container create 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:52:46 compute-1 podman[142499]: 2026-01-26 09:52:46.015444398 +0000 UTC m=+0.025654280 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 26 09:52:46 compute-1 python3[142259]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 26 09:52:46 compute-1 sudo[142250]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:46 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:52:46 compute-1 sudo[142687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ealgswqtqsztjkogehhkdqwyfufmlquo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421166.5415556-1261-219729716662471/AnsiballZ_stat.py'
Jan 26 09:52:46 compute-1 sudo[142687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:47 compute-1 python3.9[142689]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:52:47 compute-1 sudo[142687]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000055s ======
Jan 26 09:52:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:47.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000055s
Jan 26 09:52:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:47.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:47 compute-1 sudo[142842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rolembbcchgluumsxlzkpybcgdepulrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421167.4858842-1288-58333176574715/AnsiballZ_file.py'
Jan 26 09:52:47 compute-1 sudo[142842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:47 compute-1 ceph-mon[80107]: pgmap v309: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:52:47 compute-1 python3.9[142844]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:52:47 compute-1 sudo[142842]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:48 compute-1 sudo[142918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxtnvwrkfweptdmghrxskvymyigykpfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421167.4858842-1288-58333176574715/AnsiballZ_stat.py'
Jan 26 09:52:48 compute-1 sudo[142918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:48 compute-1 python3.9[142920]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:52:48 compute-1 sudo[142918]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:52:49 compute-1 sudo[143069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyyoqczsypqmrnmxofosxwteexqftyxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421168.5923753-1288-184615256381360/AnsiballZ_copy.py'
Jan 26 09:52:49 compute-1 sudo[143069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:52:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:52:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:49.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:52:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400041a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:49 compute-1 python3.9[143071]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769421168.5923753-1288-184615256381360/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:52:49 compute-1 sudo[143069]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:49.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:49 compute-1 sudo[143147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpseirpmnnlktncvhnmcoyqxvrhvojit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421168.5923753-1288-184615256381360/AnsiballZ_systemd.py'
Jan 26 09:52:49 compute-1 sudo[143147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:49 compute-1 python3.9[143149]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 09:52:49 compute-1 systemd[1]: Reloading.
Jan 26 09:52:50 compute-1 systemd-rc-local-generator[143178]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:52:50 compute-1 systemd-sysv-generator[143181]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:52:50 compute-1 sudo[143147]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:50 compute-1 ceph-mon[80107]: pgmap v310: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:52:50 compute-1 sudo[143259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykmblcumhopqvahxmfgbhqsbvldncgwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421168.5923753-1288-184615256381360/AnsiballZ_systemd.py'
Jan 26 09:52:50 compute-1 sudo[143259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:50 compute-1 python3.9[143261]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:52:51 compute-1 systemd[1]: Reloading.
Jan 26 09:52:51 compute-1 systemd-rc-local-generator[143290]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:52:51 compute-1 systemd-sysv-generator[143294]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:52:51 compute-1 sshd-session[143144]: Connection closed by authenticating user root 178.62.249.31 port 58334 [preauth]
Jan 26 09:52:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:52:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:51.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:51.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:51 compute-1 systemd[1]: Starting ovn_metadata_agent container...
Jan 26 09:52:51 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:52:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb44cd548efab5b6a01ac0e4aed17369054a220407ab2fc8e605797c1a0444eb/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 26 09:52:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb44cd548efab5b6a01ac0e4aed17369054a220407ab2fc8e605797c1a0444eb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 09:52:51 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98.
Jan 26 09:52:51 compute-1 podman[143305]: 2026-01-26 09:52:51.91638672 +0000 UTC m=+0.307572271 container init 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 26 09:52:51 compute-1 ovn_metadata_agent[143321]: + sudo -E kolla_set_configs
Jan 26 09:52:51 compute-1 podman[143305]: 2026-01-26 09:52:51.949563047 +0000 UTC m=+0.340748588 container start 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:52:51 compute-1 edpm-start-podman-container[143305]: ovn_metadata_agent
Jan 26 09:52:51 compute-1 ovn_metadata_agent[143321]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 09:52:51 compute-1 ovn_metadata_agent[143321]: INFO:__main__:Validating config file
Jan 26 09:52:51 compute-1 ovn_metadata_agent[143321]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 09:52:51 compute-1 ovn_metadata_agent[143321]: INFO:__main__:Copying service configuration files
Jan 26 09:52:51 compute-1 ovn_metadata_agent[143321]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 26 09:52:51 compute-1 ovn_metadata_agent[143321]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 26 09:52:51 compute-1 ovn_metadata_agent[143321]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 26 09:52:51 compute-1 ovn_metadata_agent[143321]: INFO:__main__:Writing out command to execute
Jan 26 09:52:51 compute-1 ovn_metadata_agent[143321]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 26 09:52:51 compute-1 ovn_metadata_agent[143321]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 26 09:52:51 compute-1 ovn_metadata_agent[143321]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 26 09:52:51 compute-1 ovn_metadata_agent[143321]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 26 09:52:51 compute-1 ovn_metadata_agent[143321]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 26 09:52:51 compute-1 ovn_metadata_agent[143321]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 26 09:52:51 compute-1 ovn_metadata_agent[143321]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 26 09:52:52 compute-1 ovn_metadata_agent[143321]: ++ cat /run_command
Jan 26 09:52:52 compute-1 ovn_metadata_agent[143321]: + CMD=neutron-ovn-metadata-agent
Jan 26 09:52:52 compute-1 ovn_metadata_agent[143321]: + ARGS=
Jan 26 09:52:52 compute-1 ovn_metadata_agent[143321]: + sudo kolla_copy_cacerts
Jan 26 09:52:52 compute-1 ceph-mon[80107]: pgmap v311: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 09:52:52 compute-1 ovn_metadata_agent[143321]: Running command: 'neutron-ovn-metadata-agent'
Jan 26 09:52:52 compute-1 ovn_metadata_agent[143321]: + [[ ! -n '' ]]
Jan 26 09:52:52 compute-1 ovn_metadata_agent[143321]: + . kolla_extend_start
Jan 26 09:52:52 compute-1 ovn_metadata_agent[143321]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 26 09:52:52 compute-1 ovn_metadata_agent[143321]: + umask 0022
Jan 26 09:52:52 compute-1 ovn_metadata_agent[143321]: + exec neutron-ovn-metadata-agent
Jan 26 09:52:52 compute-1 podman[143328]: 2026-01-26 09:52:52.055885466 +0000 UTC m=+0.086600175 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:52:52 compute-1 edpm-start-podman-container[143304]: Creating additional drop-in dependency for "ovn_metadata_agent" (6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98)
Jan 26 09:52:52 compute-1 systemd[1]: Reloading.
Jan 26 09:52:52 compute-1 systemd-rc-local-generator[143397]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:52:52 compute-1 systemd-sysv-generator[143400]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:52:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:52 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:52:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:52 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:52:52 compute-1 systemd[1]: Started ovn_metadata_agent container.
Jan 26 09:52:52 compute-1 sshd-session[143299]: Invalid user admin from 104.248.198.71 port 44906
Jan 26 09:52:52 compute-1 sudo[143407]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 09:52:52 compute-1 sudo[143407]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:52:52 compute-1 sudo[143407]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:52 compute-1 sudo[143259]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:52 compute-1 sshd-session[143299]: Connection closed by invalid user admin 104.248.198.71 port 44906 [preauth]
Jan 26 09:52:53 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:52:53 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:52:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400041c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:53.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:53.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.873 143326 INFO neutron.common.config [-] Logging enabled!
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.873 143326 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.874 143326 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.874 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.874 143326 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.874 143326 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.874 143326 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.875 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.875 143326 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.875 143326 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.875 143326 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.875 143326 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.875 143326 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.875 143326 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.875 143326 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.875 143326 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.875 143326 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.876 143326 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.876 143326 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.876 143326 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.876 143326 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.876 143326 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.876 143326 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.876 143326 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.876 143326 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.876 143326 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.876 143326 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.877 143326 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.877 143326 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.877 143326 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.877 143326 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.877 143326 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.877 143326 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.877 143326 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.877 143326 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.877 143326 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.878 143326 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.878 143326 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.878 143326 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.878 143326 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.878 143326 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.878 143326 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.878 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.878 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.880 143326 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.880 143326 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.880 143326 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.880 143326 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.880 143326 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.880 143326 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.880 143326 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.880 143326 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.880 143326 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.881 143326 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.881 143326 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.881 143326 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.881 143326 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.881 143326 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.881 143326 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.881 143326 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.881 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.881 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.882 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.882 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.882 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.882 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.882 143326 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.882 143326 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.882 143326 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.882 143326 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.882 143326 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.882 143326 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.883 143326 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.883 143326 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.883 143326 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.883 143326 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.883 143326 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.883 143326 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.883 143326 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.883 143326 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.883 143326 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.883 143326 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.884 143326 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.884 143326 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.884 143326 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.884 143326 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.884 143326 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.884 143326 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.884 143326 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.884 143326 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.884 143326 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.884 143326 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.885 143326 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.885 143326 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.885 143326 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.885 143326 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.885 143326 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.885 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.885 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.885 143326 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.886 143326 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.886 143326 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.886 143326 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.886 143326 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.886 143326 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.886 143326 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.886 143326 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.886 143326 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.887 143326 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.887 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.887 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.887 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.887 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.887 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.887 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.887 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.887 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.888 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.888 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.888 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.888 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.888 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.888 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.888 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.888 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.888 143326 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.889 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.889 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.889 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.889 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.889 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.889 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.889 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.889 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.889 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.889 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.890 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.890 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.890 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.890 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.890 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.890 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.890 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.890 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.890 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.890 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.891 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.891 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.891 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.891 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.891 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.891 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.891 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.892 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.892 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.892 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.892 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.892 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.892 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.892 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.892 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.892 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.893 143326 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.893 143326 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.893 143326 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.893 143326 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.893 143326 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.893 143326 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.893 143326 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.893 143326 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.893 143326 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.893 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.894 143326 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.894 143326 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.894 143326 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.894 143326 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.894 143326 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.894 143326 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.894 143326 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.894 143326 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.894 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.895 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.895 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.895 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.895 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.895 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.895 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.895 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.895 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.895 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.896 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.896 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.896 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.896 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.896 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.896 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.896 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.896 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.897 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.897 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.897 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.897 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.897 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.897 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.897 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.897 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.897 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.898 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.898 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.898 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.898 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.898 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.898 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.898 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.899 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.899 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.899 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.899 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.899 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.899 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.899 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.899 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.900 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.900 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.900 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.900 143326 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.900 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.900 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.900 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.901 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.901 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.901 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.901 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.901 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.901 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.901 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.901 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.901 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.902 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.902 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.902 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.902 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.902 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.902 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.902 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.903 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.903 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.903 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.903 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.903 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.903 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.903 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.904 143326 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.904 143326 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.904 143326 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.904 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.904 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.904 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.904 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.904 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.905 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.905 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.905 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.905 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.905 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.905 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.905 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.905 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.905 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.906 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.906 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.906 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.906 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.906 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.906 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.906 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.906 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.906 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.907 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.907 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.907 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.907 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.907 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.907 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.907 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.907 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.907 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.908 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.908 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.908 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.908 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.908 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.908 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.917 143326 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.917 143326 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.917 143326 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.918 143326 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.918 143326 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.934 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 5f259fb6-5896-4c89-8853-1dd537a2ebf7 (UUID: 5f259fb6-5896-4c89-8853-1dd537a2ebf7) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.958 143326 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.959 143326 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.959 143326 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.959 143326 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.962 143326 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.972 143326 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.979 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '5f259fb6-5896-4c89-8853-1dd537a2ebf7'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], external_ids={}, name=5f259fb6-5896-4c89-8853-1dd537a2ebf7, nb_cfg_timestamp=1769421104940, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.980 143326 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f24b0ca0f70>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.981 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.981 143326 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.981 143326 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.982 143326 INFO oslo_service.service [-] Starting 1 workers
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.988 143326 DEBUG oslo_service.service [-] Started child 143586 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.992 143326 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpsb9r0wpn/privsep.sock']
Jan 26 09:52:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.994 143586 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-245215'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Jan 26 09:52:54 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.032 143586 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 26 09:52:54 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.033 143586 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 26 09:52:54 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.033 143586 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 09:52:54 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.038 143586 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 26 09:52:54 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.046 143586 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 26 09:52:54 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.052 143586 INFO eventlet.wsgi.server [-] (143586) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Jan 26 09:52:54 compute-1 python3.9[143585]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 26 09:52:54 compute-1 ceph-mon[80107]: pgmap v312: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 09:52:54 compute-1 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 26 09:52:54 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.714 143326 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 26 09:52:54 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.715 143326 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpsb9r0wpn/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 26 09:52:54 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.568 143615 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 09:52:54 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.572 143615 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 09:52:54 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.574 143615 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Jan 26 09:52:54 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.575 143615 INFO oslo.privsep.daemon [-] privsep daemon running as pid 143615
Jan 26 09:52:54 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.720 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[38a28a7a-1f01-4f38-98f2-84aa8d5e0750]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:52:55 compute-1 sudo[143745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aebfjrncahueysxdtjbcagmeqvzxyhjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421174.7532618-1423-171386248859386/AnsiballZ_stat.py'
Jan 26 09:52:55 compute-1 sudo[143745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.242 143615 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.242 143615 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.242 143615 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:52:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:52:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:55.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:55 compute-1 python3.9[143747]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:52:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:55 compute-1 sudo[143745]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:55.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:55 compute-1 sudo[143871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfacbocsxcxuihgosokixkgittvteusm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421174.7532618-1423-171386248859386/AnsiballZ_copy.py'
Jan 26 09:52:55 compute-1 sudo[143871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.834 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[d008b7ba-10ee-4d98-81f6-74cbf2349b4d]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.839 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, column=external_ids, values=({'neutron:ovn-metadata-id': 'b7f82c07-4108-5b1a-8b28-274a8ea4043b'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.851 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.860 143326 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.860 143326 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.861 143326 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.861 143326 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.861 143326 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.861 143326 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.862 143326 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.862 143326 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.862 143326 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.863 143326 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.863 143326 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.863 143326 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.863 143326 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.864 143326 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.864 143326 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.864 143326 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.865 143326 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.865 143326 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.865 143326 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.866 143326 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.866 143326 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.866 143326 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.867 143326 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.867 143326 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.867 143326 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.868 143326 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.868 143326 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.869 143326 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.869 143326 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.870 143326 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.870 143326 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.871 143326 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.871 143326 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.871 143326 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.872 143326 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.872 143326 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.872 143326 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.873 143326 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.873 143326 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.874 143326 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.874 143326 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.874 143326 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.875 143326 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.875 143326 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.875 143326 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.876 143326 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 python3.9[143873]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421174.7532618-1423-171386248859386/.source.yaml _original_basename=.ag3tdlg7 follow=False checksum=e4cba382ee426a679e5ef46b4fc246a694e7130c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.876 143326 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.876 143326 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.877 143326 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.877 143326 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.878 143326 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.878 143326 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.878 143326 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.879 143326 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.879 143326 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.879 143326 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.880 143326 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.880 143326 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.880 143326 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.880 143326 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.881 143326 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.881 143326 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.881 143326 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.882 143326 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.882 143326 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.882 143326 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.883 143326 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.883 143326 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.883 143326 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.884 143326 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.884 143326 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.884 143326 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.885 143326 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.885 143326 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.885 143326 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.886 143326 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.886 143326 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.887 143326 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.887 143326 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.888 143326 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.888 143326 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.888 143326 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.889 143326 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.889 143326 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.889 143326 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.890 143326 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.890 143326 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.890 143326 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.891 143326 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.891 143326 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.891 143326 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.892 143326 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.892 143326 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.892 143326 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.893 143326 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.893 143326 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.893 143326 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.894 143326 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.894 143326 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.894 143326 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.895 143326 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.895 143326 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.895 143326 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.896 143326 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.896 143326 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.896 143326 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.897 143326 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.897 143326 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.898 143326 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 sudo[143871]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.898 143326 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.899 143326 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.899 143326 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.899 143326 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.900 143326 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.900 143326 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.901 143326 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.901 143326 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.901 143326 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.902 143326 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.902 143326 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.902 143326 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.902 143326 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.902 143326 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.903 143326 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.903 143326 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.904 143326 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.904 143326 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.904 143326 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.904 143326 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.904 143326 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.904 143326 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.904 143326 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.905 143326 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.905 143326 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.905 143326 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.905 143326 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.905 143326 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.905 143326 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.905 143326 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.906 143326 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.906 143326 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.906 143326 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.906 143326 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.906 143326 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.906 143326 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.906 143326 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.907 143326 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.907 143326 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.907 143326 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.907 143326 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.907 143326 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.907 143326 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.907 143326 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.908 143326 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.908 143326 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.908 143326 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.908 143326 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.908 143326 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.908 143326 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.908 143326 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.908 143326 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.909 143326 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.909 143326 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.909 143326 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.909 143326 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.909 143326 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.909 143326 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.909 143326 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.910 143326 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.910 143326 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.910 143326 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.910 143326 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.910 143326 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.910 143326 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.910 143326 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.911 143326 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.911 143326 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.911 143326 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.911 143326 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.911 143326 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.911 143326 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.911 143326 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.912 143326 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.912 143326 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.912 143326 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.912 143326 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.912 143326 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.912 143326 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.913 143326 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.913 143326 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.913 143326 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.913 143326 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.913 143326 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.913 143326 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.913 143326 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.914 143326 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.914 143326 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.914 143326 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.914 143326 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.914 143326 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.914 143326 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.915 143326 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.915 143326 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.915 143326 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.915 143326 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.915 143326 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.916 143326 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.916 143326 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.916 143326 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.916 143326 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.916 143326 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.916 143326 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.916 143326 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.917 143326 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.917 143326 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.917 143326 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.917 143326 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.917 143326 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.917 143326 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.918 143326 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.918 143326 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.918 143326 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.918 143326 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.918 143326 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.918 143326 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.918 143326 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.919 143326 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.919 143326 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.919 143326 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.919 143326 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.919 143326 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.920 143326 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.921 143326 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.921 143326 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.921 143326 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.921 143326 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.921 143326 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.922 143326 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.922 143326 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.922 143326 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.922 143326 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.923 143326 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.923 143326 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.923 143326 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.923 143326 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.923 143326 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.923 143326 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.923 143326 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.924 143326 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.924 143326 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.924 143326 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.924 143326 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.924 143326 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.924 143326 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.924 143326 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.924 143326 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.924 143326 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.925 143326 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.925 143326 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.925 143326 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.925 143326 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.925 143326 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.925 143326 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.925 143326 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.925 143326 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.925 143326 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.926 143326 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.926 143326 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.926 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.926 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.926 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.926 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.926 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.926 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.927 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.927 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.927 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.927 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.927 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.927 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.927 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.927 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.927 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.927 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.928 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.928 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.928 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.928 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.928 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.928 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.928 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.929 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.929 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.929 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.929 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.929 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.929 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.929 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.930 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.930 143326 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.930 143326 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.930 143326 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.930 143326 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 09:52:55 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.930 143326 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 26 09:52:56 compute-1 ceph-mon[80107]: pgmap v313: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:52:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:52:56 compute-1 sshd-session[134774]: Connection closed by 192.168.122.30 port 39096
Jan 26 09:52:56 compute-1 sshd-session[134771]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:52:56 compute-1 systemd[1]: session-51.scope: Deactivated successfully.
Jan 26 09:52:56 compute-1 systemd[1]: session-51.scope: Consumed 1min 2.975s CPU time.
Jan 26 09:52:56 compute-1 systemd-logind[783]: Session 51 logged out. Waiting for processes to exit.
Jan 26 09:52:56 compute-1 systemd-logind[783]: Removed session 51.
Jan 26 09:52:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:57 compute-1 sudo[143898]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:52:57 compute-1 sudo[143898]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:52:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:57 compute-1 sudo[143898]: pam_unix(sudo:session): session closed for user root
Jan 26 09:52:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:57.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:57.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:52:58 compute-1 ceph-mon[80107]: pgmap v314: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:52:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:52:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:59.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:52:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:52:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:52:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:52:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:59.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:00 compute-1 ceph-mon[80107]: pgmap v315: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:53:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095300 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:53:00 compute-1 sshd-session[143925]: Connection closed by authenticating user root 152.42.136.110 port 48108 [preauth]
Jan 26 09:53:01 compute-1 sshd-session[143927]: Accepted publickey for zuul from 192.168.122.30 port 53548 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:53:01 compute-1 systemd-logind[783]: New session 52 of user zuul.
Jan 26 09:53:01 compute-1 systemd[1]: Started Session 52 of User zuul.
Jan 26 09:53:01 compute-1 sshd-session[143927]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:53:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:53:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:01.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:01.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:02 compute-1 python3.9[144081]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:53:02 compute-1 ceph-mon[80107]: pgmap v316: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 1023 B/s wr, 63 op/s
Jan 26 09:53:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:03.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:03.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:04 compute-1 sudo[144236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zggpjiozynkmzyzhhhxloxmfhxcsgtuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421183.6556885-58-186002352172461/AnsiballZ_command.py'
Jan 26 09:53:04 compute-1 sudo[144236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:04 compute-1 python3.9[144238]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:53:04 compute-1 ceph-mon[80107]: pgmap v317: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 426 B/s wr, 61 op/s
Jan 26 09:53:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:53:04 compute-1 sudo[144236]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:05.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:05 compute-1 sudo[144401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpojixgpoyqyufuxsktsuonoqusguanj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421184.7712047-91-270568299223397/AnsiballZ_systemd_service.py'
Jan 26 09:53:05 compute-1 sudo[144401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:53:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:05.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:53:05 compute-1 python3.9[144403]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 09:53:05 compute-1 systemd[1]: Reloading.
Jan 26 09:53:05 compute-1 systemd-sysv-generator[144434]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:53:05 compute-1 systemd-rc-local-generator[144427]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:53:06 compute-1 sudo[144401]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:06 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:53:06 compute-1 ceph-mon[80107]: pgmap v318: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 426 B/s wr, 61 op/s
Jan 26 09:53:07 compute-1 python3.9[144589]: ansible-ansible.builtin.service_facts Invoked
Jan 26 09:53:07 compute-1 network[144606]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 09:53:07 compute-1 network[144607]: 'network-scripts' will be removed from distribution in near future.
Jan 26 09:53:07 compute-1 network[144608]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 09:53:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:07.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:07.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:08 compute-1 podman[144616]: 2026-01-26 09:53:08.08180841 +0000 UTC m=+0.138118368 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 09:53:08 compute-1 ceph-mon[80107]: pgmap v319: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 36 KiB/s rd, 85 B/s wr, 60 op/s
Jan 26 09:53:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:09.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:09.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:10 compute-1 ceph-mon[80107]: pgmap v320: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 36 KiB/s rd, 85 B/s wr, 60 op/s
Jan 26 09:53:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:11 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:53:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:11.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:11.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:12 compute-1 ceph-mon[80107]: pgmap v321: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 36 KiB/s rd, 85 B/s wr, 60 op/s
Jan 26 09:53:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:13.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:53:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:13.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:53:14 compute-1 ceph-mon[80107]: pgmap v322: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:53:14 compute-1 sudo[144900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvrokrosqxumhnjszxfhfskhxrqoqhno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421194.242121-148-228240126339513/AnsiballZ_systemd_service.py'
Jan 26 09:53:14 compute-1 sudo[144900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:14 compute-1 python3.9[144902]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:53:14 compute-1 sudo[144900]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003e80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:15.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:15 compute-1 ceph-mon[80107]: pgmap v323: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:53:15 compute-1 sudo[145054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dndmzrxuledgnorrihrsabxtzvivllbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421195.1297133-148-85138854901274/AnsiballZ_systemd_service.py'
Jan 26 09:53:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:15.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:15 compute-1 sudo[145054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:15 compute-1 python3.9[145056]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:53:15 compute-1 sudo[145054]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:16 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:53:16 compute-1 sudo[145207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhikroyeubuzuiahirrikydscbmboisv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421196.0672874-148-264304818269757/AnsiballZ_systemd_service.py'
Jan 26 09:53:16 compute-1 sudo[145207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:16 compute-1 python3.9[145209]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:53:16 compute-1 sudo[145207]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:17 compute-1 sudo[145360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jafjranwffbdxnrlubrphpwqaaonvrjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421196.8978748-148-74931049564253/AnsiballZ_systemd_service.py'
Jan 26 09:53:17 compute-1 sudo[145360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:17.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:17 compute-1 sudo[145361]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:53:17 compute-1 sudo[145361]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:53:17 compute-1 sudo[145361]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:17.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:17 compute-1 python3.9[145367]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:53:17 compute-1 sudo[145360]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:18 compute-1 ceph-mon[80107]: pgmap v324: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:53:18 compute-1 sudo[145539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubbotzpklclhwlrcoetrhmifptmarswc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421198.1707568-148-212008685510734/AnsiballZ_systemd_service.py'
Jan 26 09:53:18 compute-1 sudo[145539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:18 compute-1 python3.9[145541]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:53:18 compute-1 sudo[145539]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:19.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:53:19 compute-1 sudo[145693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pflwmgvwoubqpbnjdbwaldenzsretepf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421199.0971525-148-88574151647199/AnsiballZ_systemd_service.py'
Jan 26 09:53:19 compute-1 sudo[145693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:19.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:19 compute-1 python3.9[145695]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:53:19 compute-1 sudo[145693]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:20 compute-1 sudo[145846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksvoehfbgeruvfubztedkqybugzthcjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421200.092263-148-92645468460909/AnsiballZ_systemd_service.py'
Jan 26 09:53:20 compute-1 ceph-mon[80107]: pgmap v325: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:53:20 compute-1 sudo[145846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:20 compute-1 python3.9[145848]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:53:20 compute-1 sudo[145846]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:21 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:53:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:53:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:21.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:53:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:21 compute-1 ceph-mon[80107]: pgmap v326: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 26 09:53:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:21.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:22 compute-1 podman[145875]: 2026-01-26 09:53:22.319861183 +0000 UTC m=+0.083532400 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:53:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:23.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:53:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:23.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:53:23 compute-1 ceph-mon[80107]: pgmap v327: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:53:23 compute-1 sudo[146020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weoqpltxdujvfgylmchzynpozrceldjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421203.3924627-304-15624616368071/AnsiballZ_file.py'
Jan 26 09:53:23 compute-1 sudo[146020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:24 compute-1 python3.9[146022]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:53:24 compute-1 sudo[146020]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:24 compute-1 sudo[146172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnzkatmiaxwkihhzcvxmhwrliwixdjgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421204.3469326-304-34535502645815/AnsiballZ_file.py'
Jan 26 09:53:24 compute-1 sudo[146172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:24 compute-1 python3.9[146174]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:53:24 compute-1 sudo[146172]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:25.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:25 compute-1 sudo[146324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adzrkycbgeqvdmwvwutjrhirrncdnjsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421205.0934625-304-179283894449769/AnsiballZ_file.py'
Jan 26 09:53:25 compute-1 sudo[146324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:25.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:25 compute-1 python3.9[146327]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:53:25 compute-1 sudo[146324]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:25 compute-1 ceph-mon[80107]: pgmap v328: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:53:26 compute-1 sudo[146477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uybunzhjqdtifxethwjxqxjvlqcynpgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421205.8588016-304-231299537045775/AnsiballZ_file.py'
Jan 26 09:53:26 compute-1 sudo[146477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:26 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:53:26 compute-1 python3.9[146479]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:53:26 compute-1 sudo[146477]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:26 compute-1 sudo[146629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnrffoyumrctwddollqdfpdxuktqtrtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421206.5869162-304-157777450208928/AnsiballZ_file.py'
Jan 26 09:53:26 compute-1 sudo[146629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:27 compute-1 python3.9[146631]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:53:27 compute-1 sudo[146629]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:27.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:27.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:27 compute-1 sudo[146782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qimsqyspovrkofkfcptsqkcqtrersmjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421207.3039446-304-208808583586549/AnsiballZ_file.py'
Jan 26 09:53:27 compute-1 sudo[146782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:27 compute-1 python3.9[146784]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:53:27 compute-1 sudo[146782]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:27 compute-1 ceph-mon[80107]: pgmap v329: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:53:28 compute-1 sudo[146934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjtdwbjsyodytlmaefpthlazqqbrqhqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421207.98566-304-123027097425020/AnsiballZ_file.py'
Jan 26 09:53:28 compute-1 sudo[146934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:28 compute-1 python3.9[146936]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:53:28 compute-1 sudo[146934]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:29.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:29 compute-1 sudo[147086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hinbdkumaqpeyzsfcuivczteqiqzqspz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421209.015559-454-5161452054629/AnsiballZ_file.py'
Jan 26 09:53:29 compute-1 sudo[147086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:29 compute-1 python3.9[147088]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:53:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:29.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:29 compute-1 sudo[147086]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:29 compute-1 ceph-mon[80107]: pgmap v330: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:53:30 compute-1 sudo[147239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktakpvveiwcneqhjlbtezkgwxmeeppuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421209.757277-454-19139355719869/AnsiballZ_file.py'
Jan 26 09:53:30 compute-1 sudo[147239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:30 compute-1 python3.9[147241]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:53:30 compute-1 sudo[147239]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:30 compute-1 sudo[147391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kalkpekuexxiwmjvbngvqxqqgspwczib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421210.5119417-454-17280033704250/AnsiballZ_file.py'
Jan 26 09:53:30 compute-1 sudo[147391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:31 compute-1 python3.9[147393]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:53:31 compute-1 sudo[147391]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:31 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:53:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:31.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:31.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:31 compute-1 sudo[147544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzisrjdoeufeiulclunotfcxlehltudb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421211.3326068-454-238545379792939/AnsiballZ_file.py'
Jan 26 09:53:31 compute-1 sudo[147544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:31 compute-1 python3.9[147546]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:53:31 compute-1 sudo[147544]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:31 compute-1 ceph-mon[80107]: pgmap v331: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 26 09:53:32 compute-1 sudo[147696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgxqjpvuiqytgfbxdrlplivvspwajmtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421212.117016-454-29068571609498/AnsiballZ_file.py'
Jan 26 09:53:32 compute-1 sudo[147696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:32 compute-1 python3.9[147698]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:53:32 compute-1 sudo[147696]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003f40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000055s ======
Jan 26 09:53:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:33.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000055s
Jan 26 09:53:33 compute-1 sudo[147848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlgujfddergrirlokroriqswgbwnqzdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421212.9667182-454-229171150803816/AnsiballZ_file.py'
Jan 26 09:53:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:33 compute-1 sudo[147848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:33.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:33 compute-1 python3.9[147850]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:53:33 compute-1 sudo[147848]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:33 compute-1 ceph-mon[80107]: pgmap v332: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:53:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:53:34 compute-1 sudo[148001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdwjubcmttrlwkewawdpedqgsogwmqhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421213.7767305-454-932195908084/AnsiballZ_file.py'
Jan 26 09:53:34 compute-1 sudo[148001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:34 compute-1 python3.9[148003]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:53:34 compute-1 sudo[148001]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 09:53:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:35.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 09:53:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:35 compute-1 sudo[148155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kavcdyvmjphjuhhuikvvsjkxcifmyrfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421215.0696979-607-116748347315115/AnsiballZ_command.py'
Jan 26 09:53:35 compute-1 sudo[148155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:35.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:35 compute-1 python3.9[148157]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:53:35 compute-1 sudo[148155]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:36 compute-1 ceph-mon[80107]: pgmap v333: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:53:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:53:36 compute-1 sshd-session[148028]: Connection closed by authenticating user root 178.62.249.31 port 59434 [preauth]
Jan 26 09:53:36 compute-1 python3.9[148310]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 09:53:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:37.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:37 compute-1 sudo[148410]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:53:37 compute-1 sudo[148410]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:53:37 compute-1 sudo[148410]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:37 compute-1 sudo[148486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etgztwexkobkhfqpekssjxtaezqakfnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421217.1623616-661-169461440178559/AnsiballZ_systemd_service.py'
Jan 26 09:53:37 compute-1 sudo[148486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:37.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:37 compute-1 python3.9[148488]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 09:53:37 compute-1 systemd[1]: Reloading.
Jan 26 09:53:38 compute-1 ceph-mon[80107]: pgmap v334: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:53:38 compute-1 systemd-sysv-generator[148517]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:53:38 compute-1 systemd-rc-local-generator[148509]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:53:38 compute-1 sudo[148486]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:38 compute-1 podman[148523]: 2026-01-26 09:53:38.405436686 +0000 UTC m=+0.126490997 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:53:38 compute-1 sudo[148699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukllelaerwxairqycuxnjjrnotvrhaom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421218.549469-685-239527700151368/AnsiballZ_command.py'
Jan 26 09:53:38 compute-1 sudo[148699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:39 compute-1 python3.9[148701]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:53:39 compute-1 sudo[148699]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:39.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:39.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:39 compute-1 sudo[148853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uctrionyscxnlkmwovxbjmnwoanxfryx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421219.3723545-685-174099340060781/AnsiballZ_command.py'
Jan 26 09:53:39 compute-1 sudo[148853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:39 compute-1 python3.9[148855]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:53:40 compute-1 ceph-mon[80107]: pgmap v335: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:53:41 compute-1 sudo[148853]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:41 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:53:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:41.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:41 compute-1 sudo[149007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiphjvlvmqhiiygdnqckrqsxpcnekqam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421221.1704547-685-83085941281413/AnsiballZ_command.py'
Jan 26 09:53:41 compute-1 sudo[149007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:41.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:41 compute-1 python3.9[149009]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:53:41 compute-1 sudo[149007]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:42 compute-1 sudo[149161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcqqyncmztgplxbzpvpjoreibvnkguar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421221.905451-685-171553742441270/AnsiballZ_command.py'
Jan 26 09:53:42 compute-1 sudo[149161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:42 compute-1 python3.9[149163]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:53:42 compute-1 sudo[149161]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:42 compute-1 ceph-mon[80107]: pgmap v336: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 26 09:53:42 compute-1 sshd-session[149145]: Connection closed by authenticating user root 152.42.136.110 port 40032 [preauth]
Jan 26 09:53:43 compute-1 sudo[149315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifeukfyvywaeblwwrdhzcjypyinkszde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421222.7656336-685-156628180370083/AnsiballZ_command.py'
Jan 26 09:53:43 compute-1 sudo[149315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:43 compute-1 python3.9[149317]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:53:43 compute-1 sudo[149315]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:43.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:53:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:43.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:53:43 compute-1 ceph-mon[80107]: pgmap v337: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:53:43 compute-1 sudo[149471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbrcnyqedvznzdmmjxzhzncglmewuwbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421223.4710622-685-148451525150146/AnsiballZ_command.py'
Jan 26 09:53:43 compute-1 sudo[149471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:44 compute-1 python3.9[149473]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:53:44 compute-1 sudo[149471]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:44 compute-1 sshd-session[149427]: Invalid user ubuntu from 104.248.198.71 port 42684
Jan 26 09:53:44 compute-1 sshd-session[149427]: Connection closed by invalid user ubuntu 104.248.198.71 port 42684 [preauth]
Jan 26 09:53:44 compute-1 sudo[149624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxdhdlubqpolygqxdditfxuyealtfvuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421224.3615673-685-184459593425068/AnsiballZ_command.py'
Jan 26 09:53:44 compute-1 sudo[149624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:44 compute-1 python3.9[149626]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:53:44 compute-1 sudo[149624]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:45.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:45.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:45 compute-1 ceph-mon[80107]: pgmap v338: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:53:46 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:53:46 compute-1 sudo[149780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhtjjjpitlilvefdfggqzrcdwhkcbdbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421225.6170223-847-182109679836683/AnsiballZ_getent.py'
Jan 26 09:53:46 compute-1 sudo[149780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:46 compute-1 python3.9[149782]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 26 09:53:46 compute-1 sudo[149780]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:47 compute-1 sudo[149933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhphhfipsagioxdmjefhjwoncoqdyfbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421226.7215974-871-47109412221516/AnsiballZ_group.py'
Jan 26 09:53:47 compute-1 sudo[149933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:47.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:47 compute-1 python3.9[149935]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 09:53:47 compute-1 groupadd[149936]: group added to /etc/group: name=libvirt, GID=42473
Jan 26 09:53:47 compute-1 groupadd[149936]: group added to /etc/gshadow: name=libvirt
Jan 26 09:53:47 compute-1 groupadd[149936]: new group: name=libvirt, GID=42473
Jan 26 09:53:47 compute-1 sudo[149933]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:47.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:47 compute-1 ceph-mon[80107]: pgmap v339: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:53:48 compute-1 sudo[150092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yldzifwdmwytxnrfvuybaqkhcytsintg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421227.75423-895-207456195212744/AnsiballZ_user.py'
Jan 26 09:53:48 compute-1 sudo[150092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:48 compute-1 python3.9[150094]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 09:53:48 compute-1 useradd[150096]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Jan 26 09:53:48 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 09:53:48 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 09:53:48 compute-1 sudo[150092]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:53:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:53:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:49.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:53:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:49 compute-1 sudo[150254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddewgsnutmplglzpnfjmiwaakdetodje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421229.2015152-928-246509523677634/AnsiballZ_setup.py'
Jan 26 09:53:49 compute-1 sudo[150254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:53:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:49.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:53:49 compute-1 python3.9[150256]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 09:53:49 compute-1 ceph-mon[80107]: pgmap v340: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:53:50 compute-1 sudo[150254]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:50 compute-1 sudo[150338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dypwzjnftjytcazbxusfpjsxooxtshgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421229.2015152-928-246509523677634/AnsiballZ_dnf.py'
Jan 26 09:53:50 compute-1 sudo[150338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:53:50 compute-1 python3.9[150340]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:53:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:53:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:53:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:51.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:53:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:51.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:51 compute-1 ceph-mon[80107]: pgmap v341: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 26 09:53:52 compute-1 sudo[150349]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:53:52 compute-1 sudo[150349]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:53:52 compute-1 sudo[150349]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:52 compute-1 sudo[150380]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 09:53:52 compute-1 sudo[150380]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:53:52 compute-1 podman[150373]: 2026-01-26 09:53:52.70576095 +0000 UTC m=+0.065399158 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 09:53:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:53 compute-1 sudo[150380]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:53.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40002480 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:53:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:53.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:53:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:53:53.911 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:53:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:53:53.912 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:53:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:53:53.912 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:53:54 compute-1 ceph-mon[80107]: pgmap v342: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:53:54 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:53:54 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 09:53:54 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:53:54 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:53:54 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 09:53:54 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 09:53:54 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:53:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:53:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:55.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:53:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:55.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:56 compute-1 ceph-mon[80107]: pgmap v343: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:53:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:53:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40002480 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:53:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:57.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:53:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:57 compute-1 sudo[150483]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:53:57 compute-1 sudo[150483]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:53:57 compute-1 sudo[150483]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:53:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:57.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:53:58 compute-1 ceph-mon[80107]: pgmap v344: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:53:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40002480 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:59 compute-1 sudo[150560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 09:53:59 compute-1 sudo[150560]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:53:59 compute-1 sudo[150560]: pam_unix(sudo:session): session closed for user root
Jan 26 09:53:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:53:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:59.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:53:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004020 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:53:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:53:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:53:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:59.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:54:00 compute-1 ceph-mon[80107]: pgmap v345: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:54:00 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:54:00 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:54:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:54:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:01.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40002480 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:54:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:01.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:54:02 compute-1 ceph-mon[80107]: pgmap v346: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 26 09:54:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:03.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:54:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:03.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:54:04 compute-1 ceph-mon[80107]: pgmap v347: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:54:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:54:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:54:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:05.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:54:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:05.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:06 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:54:06 compute-1 ceph-mon[80107]: pgmap v348: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:54:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:54:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:07.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:54:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:07.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:07 compute-1 ceph-mon[80107]: pgmap v349: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:54:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:09 compute-1 podman[150687]: 2026-01-26 09:54:09.35398483 +0000 UTC m=+0.129598515 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 26 09:54:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:09.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:54:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:09.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:54:10 compute-1 ceph-mon[80107]: pgmap v350: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:54:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:11 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:54:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:54:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:11.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:54:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:54:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:11.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:54:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095412 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:54:12 compute-1 ceph-mon[80107]: pgmap v351: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 09:54:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095412 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:54:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:54:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:13.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:54:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:13.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:14 compute-1 ceph-mon[80107]: pgmap v352: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 26 09:54:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:15.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:15.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:16 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:54:16 compute-1 ceph-mon[80107]: pgmap v353: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 26 09:54:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:54:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:17.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:54:17 compute-1 ceph-mon[80107]: pgmap v354: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 26 09:54:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:17.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:17 compute-1 sudo[150721]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:54:17 compute-1 sudo[150721]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:54:17 compute-1 sudo[150721]: pam_unix(sudo:session): session closed for user root
Jan 26 09:54:18 compute-1 kernel: SELinux:  Converting 2778 SID table entries...
Jan 26 09:54:18 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 09:54:18 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 26 09:54:18 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 09:54:18 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 26 09:54:18 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 09:54:18 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 09:54:18 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 09:54:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:54:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:19.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:54:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:19.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:54:20 compute-1 ceph-mon[80107]: pgmap v355: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 26 09:54:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:20 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:54:20 compute-1 sshd-session[150755]: Connection closed by authenticating user root 178.62.249.31 port 54778 [preauth]
Jan 26 09:54:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:21 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:54:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:21.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:21.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:22 compute-1 ceph-mon[80107]: pgmap v356: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 853 B/s rd, 170 B/s wr, 1 op/s
Jan 26 09:54:23 compute-1 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 26 09:54:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff1800bd80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:23 compute-1 podman[150758]: 2026-01-26 09:54:23.325296834 +0000 UTC m=+0.081824685 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 26 09:54:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:54:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:23.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:54:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:54:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:54:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:54:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:23.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:54:24 compute-1 ceph-mon[80107]: pgmap v357: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 170 B/s wr, 1 op/s
Jan 26 09:54:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:54:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff1800bd80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:54:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:25.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:54:25 compute-1 sshd-session[150778]: Connection closed by authenticating user root 152.42.136.110 port 45036 [preauth]
Jan 26 09:54:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:54:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:25.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:54:26 compute-1 ceph-mon[80107]: pgmap v358: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 682 B/s wr, 2 op/s
Jan 26 09:54:26 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:54:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff1800bd80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:54:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:27.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:54:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:27.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:28 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:54:28 compute-1 kernel: SELinux:  Converting 2778 SID table entries...
Jan 26 09:54:28 compute-1 ceph-mon[80107]: pgmap v359: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 682 B/s wr, 2 op/s
Jan 26 09:54:28 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 09:54:28 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 26 09:54:28 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 09:54:28 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 26 09:54:28 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 09:54:28 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 09:54:28 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 09:54:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff1800bd80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:29.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:29.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:30 compute-1 ceph-mon[80107]: pgmap v360: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 682 B/s wr, 2 op/s
Jan 26 09:54:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:31 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:54:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:31.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:31 compute-1 ceph-mon[80107]: pgmap v361: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.1 KiB/s rd, 1.1 KiB/s wr, 4 op/s
Jan 26 09:54:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:31.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095432 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:54:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:33.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:33.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:34 compute-1 ceph-mon[80107]: pgmap v362: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:54:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:54:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095434 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:54:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:54:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:35.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:54:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:35.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:54:36 compute-1 sshd-session[150794]: Invalid user ubuntu from 104.248.198.71 port 54212
Jan 26 09:54:36 compute-1 ceph-mon[80107]: pgmap v363: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:54:36 compute-1 sshd-session[150794]: Connection closed by invalid user ubuntu 104.248.198.71 port 54212 [preauth]
Jan 26 09:54:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:37.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:37 compute-1 ceph-mon[80107]: pgmap v364: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 426 B/s wr, 2 op/s
Jan 26 09:54:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:54:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:37.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:54:37 compute-1 sudo[150797]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:54:37 compute-1 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 26 09:54:37 compute-1 sudo[150797]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:54:37 compute-1 sudo[150797]: pam_unix(sudo:session): session closed for user root
Jan 26 09:54:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:39.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000053s ======
Jan 26 09:54:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:39.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 26 09:54:39 compute-1 ceph-mon[80107]: pgmap v365: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 426 B/s wr, 2 op/s
Jan 26 09:54:40 compute-1 podman[150823]: 2026-01-26 09:54:40.336841268 +0000 UTC m=+0.105312059 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Jan 26 09:54:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:41 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:54:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:54:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:41.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:54:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:41.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:42 compute-1 ceph-mon[80107]: pgmap v366: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 426 B/s wr, 2 op/s
Jan 26 09:54:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:54:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:43.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:54:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:43.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:44 compute-1 ceph-mon[80107]: pgmap v367: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:54:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:54:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:45.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:54:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:45.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:46 compute-1 ceph-mon[80107]: pgmap v368: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:54:46 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:54:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:47.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:47.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:48 compute-1 ceph-mon[80107]: pgmap v369: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:54:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:49.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:54:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:49.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:51 compute-1 ceph-mon[80107]: pgmap v370: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:54:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:54:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:51.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:54:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:51.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:54:52 compute-1 ceph-mon[80107]: pgmap v371: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 09:54:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:54:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:53.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:54:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:54:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:53.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:54:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:54:53.913 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:54:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:54:53.917 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:54:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:54:53.917 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:54:54 compute-1 ceph-mon[80107]: pgmap v372: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:54:54 compute-1 podman[158004]: 2026-01-26 09:54:54.314363271 +0000 UTC m=+0.075960093 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 26 09:54:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004a00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:55.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:55.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:56 compute-1 ceph-mon[80107]: pgmap v373: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 09:54:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:54:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:57.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:57.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:54:57 compute-1 sudo[159946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:54:57 compute-1 sudo[159946]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:54:57 compute-1 sudo[159946]: pam_unix(sudo:session): session closed for user root
Jan 26 09:54:58 compute-1 ceph-mon[80107]: pgmap v374: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:54:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:54:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 09:54:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:59.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 09:54:59 compute-1 sudo[160810]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:54:59 compute-1 sudo[160810]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:54:59 compute-1 sudo[160810]: pam_unix(sudo:session): session closed for user root
Jan 26 09:54:59 compute-1 sudo[160884]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 26 09:54:59 compute-1 sudo[160884]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:54:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:54:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:54:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:59.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:00 compute-1 ceph-mon[80107]: pgmap v375: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:55:00 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 26 09:55:00 compute-1 podman[161346]: 2026-01-26 09:55:00.292143627 +0000 UTC m=+0.141941575 container exec 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 26 09:55:00 compute-1 podman[161346]: 2026-01-26 09:55:00.401257004 +0000 UTC m=+0.251054942 container exec_died 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Jan 26 09:55:01 compute-1 podman[161916]: 2026-01-26 09:55:01.102163682 +0000 UTC m=+0.083888400 container exec 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 09:55:01 compute-1 podman[161916]: 2026-01-26 09:55:01.11818426 +0000 UTC m=+0.099908938 container exec_died 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 09:55:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:55:01 compute-1 podman[162153]: 2026-01-26 09:55:01.418498716 +0000 UTC m=+0.061134966 container exec a0e54dfdc80639d6a858a302b9137bdb7eb4ae31fc0c5c95cbf6faee0fa517e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:55:01 compute-1 podman[162153]: 2026-01-26 09:55:01.436236879 +0000 UTC m=+0.078873099 container exec_died a0e54dfdc80639d6a858a302b9137bdb7eb4ae31fc0c5c95cbf6faee0fa517e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20250325)
Jan 26 09:55:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:55:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:01.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:55:01 compute-1 podman[162363]: 2026-01-26 09:55:01.670998664 +0000 UTC m=+0.054774009 container exec 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 09:55:01 compute-1 podman[162363]: 2026-01-26 09:55:01.684020254 +0000 UTC m=+0.067795589 container exec_died 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 09:55:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:01.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:01 compute-1 podman[162575]: 2026-01-26 09:55:01.914797766 +0000 UTC m=+0.056225538 container exec 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, io.openshift.expose-services=, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, release=1793, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, version=2.2.4, architecture=x86_64, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 26 09:55:01 compute-1 podman[162575]: 2026-01-26 09:55:01.928281798 +0000 UTC m=+0.069709560 container exec_died 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, architecture=x86_64, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., vcs-type=git, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, version=2.2.4)
Jan 26 09:55:01 compute-1 sudo[160884]: pam_unix(sudo:session): session closed for user root
Jan 26 09:55:02 compute-1 sudo[162711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:55:02 compute-1 sudo[162711]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:55:02 compute-1 sudo[162711]: pam_unix(sudo:session): session closed for user root
Jan 26 09:55:02 compute-1 sudo[162781]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 09:55:02 compute-1 sudo[162781]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:55:02 compute-1 ceph-mon[80107]: pgmap v376: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 09:55:02 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:55:02 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:55:02 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:55:02 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:55:02 compute-1 sudo[162781]: pam_unix(sudo:session): session closed for user root
Jan 26 09:55:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 26 09:55:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 26 09:55:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:55:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 09:55:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:55:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:55:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 09:55:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 09:55:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:55:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:03.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:55:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:03.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:55:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:55:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:05.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:55:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 09:55:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:05.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 09:55:06 compute-1 ceph-mon[80107]: pgmap v377: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:55:06 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:55:06 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:55:06 compute-1 sshd-session[164748]: Connection closed by authenticating user root 178.62.249.31 port 41012 [preauth]
Jan 26 09:55:06 compute-1 ceph-mon[80107]: pgmap v378: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 09:55:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:55:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:07.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:55:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:07.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:08 compute-1 ceph-mon[80107]: pgmap v379: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:55:08 compute-1 sshd-session[165695]: Connection closed by authenticating user root 152.42.136.110 port 50872 [preauth]
Jan 26 09:55:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:55:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:09.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:55:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:09.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:10 compute-1 ceph-mon[80107]: pgmap v380: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:55:11 compute-1 sudo[167857]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 09:55:11 compute-1 sudo[167857]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:55:11 compute-1 sudo[167857]: pam_unix(sudo:session): session closed for user root
Jan 26 09:55:11 compute-1 podman[167922]: 2026-01-26 09:55:11.214662462 +0000 UTC m=+0.089169277 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 26 09:55:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:11 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:55:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:11.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:11.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:12 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:55:12 compute-1 ceph-mon[80107]: pgmap v381: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 09:55:12 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:55:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:55:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:13.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:55:13 compute-1 ceph-mon[80107]: pgmap v382: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:55:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:13.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:55:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:15.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:55:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 09:55:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:15.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 09:55:16 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:55:16 compute-1 ceph-mon[80107]: pgmap v383: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 09:55:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004ae0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004ae0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:17.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:17.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:17 compute-1 ceph-mon[80107]: pgmap v384: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:55:17 compute-1 sudo[168434]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:55:17 compute-1 sudo[168434]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:55:17 compute-1 sudo[168434]: pam_unix(sudo:session): session closed for user root
Jan 26 09:55:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:55:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:55:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:19.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:55:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:19.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:19 compute-1 ceph-mon[80107]: pgmap v385: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:55:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095521 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:55:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004b00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:21 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:55:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:55:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:21.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:55:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:55:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:21.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:55:22 compute-1 ceph-mon[80107]: pgmap v386: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 09:55:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:23.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:23.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:24 compute-1 ceph-mon[80107]: pgmap v387: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:55:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:25 compute-1 podman[168462]: 2026-01-26 09:55:25.321909157 +0000 UTC m=+0.086081938 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 26 09:55:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:25.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:55:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:25.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:55:26 compute-1 sshd-session[168483]: Invalid user ubuntu from 104.248.198.71 port 41766
Jan 26 09:55:27 compute-1 sshd-session[168483]: Connection closed by invalid user ubuntu 104.248.198.71 port 41766 [preauth]
Jan 26 09:55:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:27.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:55:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 09:55:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:27.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 09:55:27 compute-1 ceph-mon[80107]: pgmap v388: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:55:28 compute-1 kernel: SELinux:  Converting 2779 SID table entries...
Jan 26 09:55:28 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 09:55:28 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 26 09:55:28 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 09:55:28 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 26 09:55:28 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 09:55:28 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 09:55:28 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 09:55:28 compute-1 ceph-mon[80107]: pgmap v389: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:55:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:55:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:29.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:55:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:29.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:30 compute-1 ceph-mon[80107]: pgmap v390: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:55:31 compute-1 groupadd[168499]: group added to /etc/group: name=dnsmasq, GID=993
Jan 26 09:55:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:55:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:31.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:55:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:31.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:55:31 compute-1 groupadd[168499]: group added to /etc/gshadow: name=dnsmasq
Jan 26 09:55:31 compute-1 groupadd[168499]: new group: name=dnsmasq, GID=993
Jan 26 09:55:32 compute-1 useradd[168507]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Jan 26 09:55:32 compute-1 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 26 09:55:32 compute-1 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 26 09:55:32 compute-1 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 26 09:55:32 compute-1 ceph-mon[80107]: pgmap v391: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:55:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:55:33 compute-1 groupadd[168520]: group added to /etc/group: name=clevis, GID=992
Jan 26 09:55:33 compute-1 groupadd[168520]: group added to /etc/gshadow: name=clevis
Jan 26 09:55:33 compute-1 groupadd[168520]: new group: name=clevis, GID=992
Jan 26 09:55:33 compute-1 useradd[168527]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Jan 26 09:55:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004ba0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:33 compute-1 usermod[168537]: add 'clevis' to group 'tss'
Jan 26 09:55:33 compute-1 usermod[168537]: add 'clevis' to shadow group 'tss'
Jan 26 09:55:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:33.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:55:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:33.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:55:34 compute-1 ceph-mon[80107]: pgmap v392: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:55:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:55:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:55:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:55:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004bc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:35.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:35.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:35 compute-1 polkitd[43555]: Reloading rules
Jan 26 09:55:35 compute-1 polkitd[43555]: Collecting garbage unconditionally...
Jan 26 09:55:35 compute-1 polkitd[43555]: Loading rules from directory /etc/polkit-1/rules.d
Jan 26 09:55:35 compute-1 polkitd[43555]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 26 09:55:35 compute-1 polkitd[43555]: Finished loading, compiling and executing 3 rules
Jan 26 09:55:35 compute-1 polkitd[43555]: Reloading rules
Jan 26 09:55:35 compute-1 polkitd[43555]: Collecting garbage unconditionally...
Jan 26 09:55:35 compute-1 polkitd[43555]: Loading rules from directory /etc/polkit-1/rules.d
Jan 26 09:55:35 compute-1 polkitd[43555]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 26 09:55:35 compute-1 polkitd[43555]: Finished loading, compiling and executing 3 rules
Jan 26 09:55:36 compute-1 ceph-mon[80107]: pgmap v393: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 767 B/s wr, 2 op/s
Jan 26 09:55:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:37 compute-1 groupadd[168729]: group added to /etc/group: name=ceph, GID=167
Jan 26 09:55:37 compute-1 groupadd[168729]: group added to /etc/gshadow: name=ceph
Jan 26 09:55:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:37 compute-1 groupadd[168729]: new group: name=ceph, GID=167
Jan 26 09:55:37 compute-1 useradd[168735]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Jan 26 09:55:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004be0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:37.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:55:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:37.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:38 compute-1 sudo[168743]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:55:38 compute-1 sudo[168743]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:55:38 compute-1 sudo[168743]: pam_unix(sudo:session): session closed for user root
Jan 26 09:55:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:38 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:55:38 compute-1 ceph-mon[80107]: pgmap v394: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 767 B/s wr, 2 op/s
Jan 26 09:55:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff480089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:55:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:39.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:55:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:39.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:39 compute-1 ceph-mon[80107]: pgmap v395: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 767 B/s wr, 2 op/s
Jan 26 09:55:40 compute-1 sshd[1006]: Received signal 15; terminating.
Jan 26 09:55:40 compute-1 systemd[1]: Stopping OpenSSH server daemon...
Jan 26 09:55:40 compute-1 systemd[1]: sshd.service: Deactivated successfully.
Jan 26 09:55:40 compute-1 systemd[1]: Stopped OpenSSH server daemon.
Jan 26 09:55:40 compute-1 systemd[1]: sshd.service: Consumed 7.777s CPU time, read 32.0K from disk, written 112.0K to disk.
Jan 26 09:55:40 compute-1 systemd[1]: Stopped target sshd-keygen.target.
Jan 26 09:55:40 compute-1 systemd[1]: Stopping sshd-keygen.target...
Jan 26 09:55:40 compute-1 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 09:55:40 compute-1 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 09:55:40 compute-1 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 09:55:40 compute-1 systemd[1]: Reached target sshd-keygen.target.
Jan 26 09:55:40 compute-1 systemd[1]: Starting OpenSSH server daemon...
Jan 26 09:55:40 compute-1 sshd[169407]: Server listening on 0.0.0.0 port 22.
Jan 26 09:55:40 compute-1 sshd[169407]: Server listening on :: port 22.
Jan 26 09:55:40 compute-1 systemd[1]: Started OpenSSH server daemon.
Jan 26 09:55:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:41 compute-1 podman[169451]: 2026-01-26 09:55:41.416013642 +0000 UTC m=+0.140604830 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 26 09:55:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:41.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:41.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:42 compute-1 ceph-mon[80107]: pgmap v396: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:55:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:55:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095543 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:55:43 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 09:55:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:43 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 09:55:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004c20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:43 compute-1 systemd[1]: Reloading.
Jan 26 09:55:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004c20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:43 compute-1 systemd-rc-local-generator[169690]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:55:43 compute-1 systemd-sysv-generator[169696]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:55:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:43.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:43 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 09:55:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:43.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004c20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:55:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:45.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:55:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:55:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:45.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:55:46 compute-1 ceph-mon[80107]: pgmap v397: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Jan 26 09:55:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004c20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:47 compute-1 ceph-mon[80107]: pgmap v398: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 26 09:55:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:47.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:55:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:47.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:48 compute-1 sudo[150338]: pam_unix(sudo:session): session closed for user root
Jan 26 09:55:48 compute-1 ceph-mon[80107]: pgmap v399: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 255 B/s wr, 1 op/s
Jan 26 09:55:49 compute-1 sudo[174917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uijjqvlqkyusnxfgajubhowpewcxytnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421348.441472-964-107574994044253/AnsiballZ_systemd.py'
Jan 26 09:55:49 compute-1 sudo[174917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:55:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004c20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:49 compute-1 python3.9[174938]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 09:55:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004c20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:49 compute-1 systemd[1]: Reloading.
Jan 26 09:55:49 compute-1 systemd-rc-local-generator[175394]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:55:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:49 compute-1 systemd-sysv-generator[175398]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:55:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:55:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:49.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:55:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:55:49 compute-1 ceph-mon[80107]: pgmap v400: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 255 B/s wr, 1 op/s
Jan 26 09:55:49 compute-1 sudo[174917]: pam_unix(sudo:session): session closed for user root
Jan 26 09:55:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:49.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:50 compute-1 sudo[176158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttjazccjmnicaletwnriinntxmzjuzaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421349.8824441-964-192082643976102/AnsiballZ_systemd.py'
Jan 26 09:55:50 compute-1 sudo[176158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:55:50 compute-1 python3.9[176180]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 09:55:50 compute-1 systemd[1]: Reloading.
Jan 26 09:55:50 compute-1 systemd-sysv-generator[176686]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:55:50 compute-1 systemd-rc-local-generator[176682]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:55:50 compute-1 sudo[176158]: pam_unix(sudo:session): session closed for user root
Jan 26 09:55:51 compute-1 sudo[177522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dacdzrnpdkcxativnczkegmplhghrfok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421351.012964-964-226266497303878/AnsiballZ_systemd.py'
Jan 26 09:55:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:51 compute-1 sudo[177522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:55:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004c20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:51.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:51 compute-1 python3.9[177542]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 09:55:51 compute-1 systemd[1]: Reloading.
Jan 26 09:55:51 compute-1 systemd-sysv-generator[177991]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:55:51 compute-1 systemd-rc-local-generator[177985]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:55:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:51.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:51 compute-1 sshd-session[176616]: Connection closed by authenticating user root 178.62.249.31 port 46428 [preauth]
Jan 26 09:55:51 compute-1 ceph-mon[80107]: pgmap v401: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 255 B/s wr, 1 op/s
Jan 26 09:55:52 compute-1 sudo[177522]: pam_unix(sudo:session): session closed for user root
Jan 26 09:55:52 compute-1 sshd-session[177080]: Connection closed by authenticating user root 152.42.136.110 port 49680 [preauth]
Jan 26 09:55:52 compute-1 sudo[178752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvdwuekefvesnlmazdkhueybfstcvaqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421352.1611872-964-262669105361067/AnsiballZ_systemd.py'
Jan 26 09:55:52 compute-1 sudo[178752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:55:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:55:52 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 09:55:52 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 09:55:52 compute-1 systemd[1]: man-db-cache-update.service: Consumed 11.852s CPU time.
Jan 26 09:55:52 compute-1 systemd[1]: run-r30ecc3fc361a49d4ae0e2a05cdb3fce2.service: Deactivated successfully.
Jan 26 09:55:52 compute-1 python3.9[178773]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 09:55:52 compute-1 systemd[1]: Reloading.
Jan 26 09:55:52 compute-1 systemd-sysv-generator[178837]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:55:52 compute-1 systemd-rc-local-generator[178833]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:55:53 compute-1 sudo[178752]: pam_unix(sudo:session): session closed for user root
Jan 26 09:55:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004c20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:55:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:53.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:55:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:53.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:55:53.915 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:55:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:55:53.916 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:55:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:55:53.917 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:55:54 compute-1 ceph-mon[80107]: pgmap v402: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:55:55 compute-1 sudo[178992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whklugmsacdihydoyxcqdttylwagucbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421354.779485-1051-203276891241880/AnsiballZ_systemd.py'
Jan 26 09:55:55 compute-1 sudo[178992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:55:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004c20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:55 compute-1 python3.9[178994]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 09:55:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:55.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:55 compute-1 podman[178997]: 2026-01-26 09:55:55.616065417 +0000 UTC m=+0.079957208 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 09:55:55 compute-1 systemd[1]: Reloading.
Jan 26 09:55:55 compute-1 systemd-sysv-generator[179047]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:55:55 compute-1 systemd-rc-local-generator[179042]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:55:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:55.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:56 compute-1 sudo[178992]: pam_unix(sudo:session): session closed for user root
Jan 26 09:55:56 compute-1 ceph-mon[80107]: pgmap v403: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:55:56 compute-1 sudo[179202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tplukekvnznsgslgjkjosknjhmkowkzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421356.2298563-1051-151515901189632/AnsiballZ_systemd.py'
Jan 26 09:55:56 compute-1 sudo[179202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:55:56 compute-1 python3.9[179204]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 09:55:56 compute-1 systemd[1]: Reloading.
Jan 26 09:55:57 compute-1 systemd-rc-local-generator[179232]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:55:57 compute-1 systemd-sysv-generator[179238]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:55:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:57 compute-1 sudo[179202]: pam_unix(sudo:session): session closed for user root
Jan 26 09:55:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003640 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:57.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:55:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:57.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:57 compute-1 sudo[179393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrfqoybnfybroqkaterpwjkxysyspoby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421357.5429597-1051-102974373665065/AnsiballZ_systemd.py'
Jan 26 09:55:57 compute-1 sudo[179393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:55:58 compute-1 ceph-mon[80107]: pgmap v404: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:55:58 compute-1 sudo[179396]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:55:58 compute-1 sudo[179396]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:55:58 compute-1 sudo[179396]: pam_unix(sudo:session): session closed for user root
Jan 26 09:55:58 compute-1 python3.9[179395]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 09:55:58 compute-1 systemd[1]: Reloading.
Jan 26 09:55:58 compute-1 systemd-rc-local-generator[179454]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:55:58 compute-1 systemd-sysv-generator[179458]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:55:58 compute-1 sudo[179393]: pam_unix(sudo:session): session closed for user root
Jan 26 09:55:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:59 compute-1 sudo[179609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtbfezficzoprviacvytsycoaypkkyxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421358.915781-1051-274666408047965/AnsiballZ_systemd.py'
Jan 26 09:55:59 compute-1 sudo[179609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:55:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:55:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:59.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:55:59 compute-1 python3.9[179611]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 09:55:59 compute-1 sudo[179609]: pam_unix(sudo:session): session closed for user root
Jan 26 09:55:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:55:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:55:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:59.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:00 compute-1 ceph-mon[80107]: pgmap v405: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:56:00 compute-1 sudo[179765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xebuyxmkxudvshashyufsnniiukwzbdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421359.9800408-1051-269821404840732/AnsiballZ_systemd.py'
Jan 26 09:56:00 compute-1 sudo[179765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:00 compute-1 python3.9[179767]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 09:56:00 compute-1 systemd[1]: Reloading.
Jan 26 09:56:00 compute-1 systemd-rc-local-generator[179801]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:56:00 compute-1 systemd-sysv-generator[179805]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:56:01 compute-1 sudo[179765]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:56:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:01.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:56:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:56:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:01.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:56:02 compute-1 sudo[179957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxdcztjgpmkthgjavcnlqhcysliuzlgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421361.8586154-1159-6921329442202/AnsiballZ_systemd.py'
Jan 26 09:56:02 compute-1 sudo[179957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:02 compute-1 ceph-mon[80107]: pgmap v406: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 09:56:02 compute-1 python3.9[179959]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 09:56:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:56:02 compute-1 systemd[1]: Reloading.
Jan 26 09:56:02 compute-1 systemd-sysv-generator[179996]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:56:02 compute-1 systemd-rc-local-generator[179992]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:56:02 compute-1 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 26 09:56:02 compute-1 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 26 09:56:03 compute-1 sudo[179957]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:03.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:03 compute-1 sudo[180151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrwylausqyzwfewxurnztqwvsfqiyqgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421363.3209777-1183-96383287725728/AnsiballZ_systemd.py'
Jan 26 09:56:03 compute-1 sudo[180151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:56:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:03.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:56:03 compute-1 python3.9[180153]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 09:56:04 compute-1 sudo[180151]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:04 compute-1 ceph-mon[80107]: pgmap v407: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:56:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:56:04 compute-1 sudo[180306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddetsvpgdvuzfowayntdtrbzjloewbjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421364.2307878-1183-154850282958136/AnsiballZ_systemd.py'
Jan 26 09:56:04 compute-1 sudo[180306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:04 compute-1 python3.9[180308]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 09:56:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:05.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:56:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:05.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:56:06 compute-1 sudo[180306]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:06 compute-1 ceph-mon[80107]: pgmap v408: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 09:56:06 compute-1 sudo[180462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puzkjkfqhcdmltqpqofesnbccaxlubwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421366.189594-1183-238210271070331/AnsiballZ_systemd.py'
Jan 26 09:56:06 compute-1 sudo[180462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:06 compute-1 python3.9[180464]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 09:56:06 compute-1 sudo[180462]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:07 compute-1 sudo[180617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxvmkmhvsrjxdvywocnbpvdrrjahkuyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421367.0549395-1183-147173277499350/AnsiballZ_systemd.py'
Jan 26 09:56:07 compute-1 sudo[180617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:56:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:07.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:56:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:56:07 compute-1 python3.9[180619]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 09:56:07 compute-1 sudo[180617]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:07.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:08 compute-1 sudo[180773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbhveqssiddavplcwcarxepbsiznkkle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421367.9663682-1183-68952103721044/AnsiballZ_systemd.py'
Jan 26 09:56:08 compute-1 sudo[180773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:08 compute-1 ceph-mon[80107]: pgmap v409: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:56:08 compute-1 python3.9[180775]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 09:56:08 compute-1 sudo[180773]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:09 compute-1 sudo[180928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fempvxqcotzjbmdcnzbnpeyqtmrokevt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421368.8918178-1183-208147221061272/AnsiballZ_systemd.py'
Jan 26 09:56:09 compute-1 sudo[180928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:09 compute-1 python3.9[180930]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 09:56:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:56:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:09.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:56:09 compute-1 sudo[180928]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:09.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:10 compute-1 sudo[181084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nubdchlxnbypguqfjmrspkjbseduudzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421369.7518866-1183-38258512973147/AnsiballZ_systemd.py'
Jan 26 09:56:10 compute-1 sudo[181084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:10 compute-1 python3.9[181086]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 09:56:10 compute-1 sudo[181084]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:10 compute-1 ceph-mon[80107]: pgmap v410: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:56:10 compute-1 sudo[181239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emrctawmbrnvbffoxktullonckuignur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421370.6290622-1183-47944276126907/AnsiballZ_systemd.py'
Jan 26 09:56:10 compute-1 sudo[181239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095611 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:56:11 compute-1 python3.9[181241]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 09:56:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:11 compute-1 sudo[181242]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:56:11 compute-1 sudo[181242]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:56:11 compute-1 sudo[181242]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:11 compute-1 sudo[181239]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:11 compute-1 sudo[181270]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 09:56:11 compute-1 sudo[181270]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:56:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 09:56:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:11.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 09:56:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:56:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:11.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:56:11 compute-1 sudo[181480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xumltsoihfvwiprtikpuchktfkmmhcht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421371.5602014-1183-13814618894565/AnsiballZ_systemd.py'
Jan 26 09:56:11 compute-1 sudo[181480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:11 compute-1 podman[181436]: 2026-01-26 09:56:11.930674435 +0000 UTC m=+0.094143108 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 09:56:11 compute-1 sudo[181270]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:12 compute-1 python3.9[181501]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 09:56:12 compute-1 sudo[181480]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:12 compute-1 ceph-mon[80107]: pgmap v411: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 09:56:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:56:12 compute-1 sudo[181656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oymiqibgoayjsuskanyspeqekinthhwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421372.457772-1183-218712039241483/AnsiballZ_systemd.py'
Jan 26 09:56:12 compute-1 sudo[181656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:13 compute-1 python3.9[181658]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 09:56:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003680 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:56:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:13.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:56:13 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:56:13 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 09:56:13 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:56:13 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:56:13 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 09:56:13 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 09:56:13 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:56:13 compute-1 ceph-mon[80107]: pgmap v412: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:56:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:56:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:13.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:56:14 compute-1 sudo[181656]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:14 compute-1 sudo[181812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqvunhgykjarknywjnecllbyjchibkvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421374.3156142-1183-161711134998500/AnsiballZ_systemd.py'
Jan 26 09:56:14 compute-1 sudo[181812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:15 compute-1 python3.9[181814]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 09:56:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c0036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:56:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:15.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:56:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:15.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:15 compute-1 ceph-mon[80107]: pgmap v413: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:56:16 compute-1 sudo[181812]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:16 compute-1 sudo[181968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajpdrtmacbezbwakbhycxmsrqnywyzcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421376.3266726-1183-255472085081736/AnsiballZ_systemd.py'
Jan 26 09:56:16 compute-1 sudo[181968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:17 compute-1 python3.9[181970]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 09:56:17 compute-1 sudo[181968]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:17.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:56:17 compute-1 sudo[182124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrtjviyzcbhonyusfrldgeujakrmmral ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421377.3272953-1183-59298471223465/AnsiballZ_systemd.py'
Jan 26 09:56:17 compute-1 sudo[182124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:56:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:17.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:56:17 compute-1 ceph-mon[80107]: pgmap v414: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:56:18 compute-1 python3.9[182127]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 09:56:18 compute-1 sudo[182131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 09:56:18 compute-1 sudo[182131]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:56:18 compute-1 sudo[182131]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:18 compute-1 sudo[182124]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:18 compute-1 sudo[182170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:56:18 compute-1 sudo[182170]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:56:18 compute-1 sudo[182170]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:18 compute-1 sshd-session[182126]: Invalid user ubuntu from 104.248.198.71 port 39012
Jan 26 09:56:18 compute-1 sshd-session[182126]: Connection closed by invalid user ubuntu 104.248.198.71 port 39012 [preauth]
Jan 26 09:56:18 compute-1 sudo[182331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkmhgfruzwtvwxrtrwsmqaowjlqnycve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421378.303931-1183-233635839625365/AnsiballZ_systemd.py'
Jan 26 09:56:18 compute-1 sudo[182331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:18 compute-1 python3.9[182333]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 09:56:18 compute-1 sudo[182331]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:56:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:56:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:56:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c0036c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:56:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:56:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:19.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:56:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:19.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:19 compute-1 sudo[182487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iatheuxhasupfadkkmgigwvgshktcqjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421379.576979-1489-56959701473118/AnsiballZ_file.py'
Jan 26 09:56:19 compute-1 sudo[182487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:20 compute-1 ceph-mon[80107]: pgmap v415: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:56:20 compute-1 python3.9[182489]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:56:20 compute-1 sudo[182487]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:20 compute-1 sudo[182639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyypjumakskofyqyvfjptfmqdoorykjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421380.3709583-1489-104088694761441/AnsiballZ_file.py'
Jan 26 09:56:20 compute-1 sudo[182639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:20 compute-1 python3.9[182641]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:56:20 compute-1 sudo[182639]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:21 compute-1 sudo[182793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukwgyeykbagvkwmfycqmhsbjkyggwnqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421381.075767-1489-54893283807927/AnsiballZ_file.py'
Jan 26 09:56:21 compute-1 sudo[182793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c0036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c0036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:21 compute-1 python3.9[182795]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:56:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:21.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:21 compute-1 sudo[182793]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:21.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:22 compute-1 sudo[182946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjymgcklneslqhsiujdrbsysoceobqmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421381.7593963-1489-81964304417596/AnsiballZ_file.py'
Jan 26 09:56:22 compute-1 sudo[182946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:22 compute-1 ceph-mon[80107]: pgmap v416: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 09:56:22 compute-1 python3.9[182948]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:56:22 compute-1 sudo[182946]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:22 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:56:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:22 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:56:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:22 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:56:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:56:22 compute-1 sudo[183098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfmphgzyhkldgjytvhokvbshnsqxgmbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421382.378438-1489-67627217055079/AnsiballZ_file.py'
Jan 26 09:56:22 compute-1 sudo[183098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:22 compute-1 python3.9[183100]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:56:22 compute-1 sudo[183098]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.080030) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421383080085, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 4703, "num_deletes": 502, "total_data_size": 12917563, "memory_usage": 13092048, "flush_reason": "Manual Compaction"}
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421383146958, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 8366266, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13208, "largest_seqno": 17906, "table_properties": {"data_size": 8348437, "index_size": 12083, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 4677, "raw_key_size": 36561, "raw_average_key_size": 19, "raw_value_size": 8311857, "raw_average_value_size": 4475, "num_data_blocks": 528, "num_entries": 1857, "num_filter_entries": 1857, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420938, "oldest_key_time": 1769420938, "file_creation_time": 1769421383, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 66955 microseconds, and 27408 cpu microseconds.
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.146997) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 8366266 bytes OK
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.147014) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.149341) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.149358) EVENT_LOG_v1 {"time_micros": 1769421383149354, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.149396) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 12897232, prev total WAL file size 12897232, number of live WAL files 2.
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.153141) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(8170KB)], [27(12MB)]
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421383153237, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 21245174, "oldest_snapshot_seqno": -1}
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 5067 keys, 15580489 bytes, temperature: kUnknown
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421383277106, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 15580489, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15541645, "index_size": 25102, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12677, "raw_key_size": 126680, "raw_average_key_size": 25, "raw_value_size": 15444863, "raw_average_value_size": 3048, "num_data_blocks": 1057, "num_entries": 5067, "num_filter_entries": 5067, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769421383, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.277520) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 15580489 bytes
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.279084) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 171.4 rd, 125.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(8.0, 12.3 +0.0 blob) out(14.9 +0.0 blob), read-write-amplify(4.4) write-amplify(1.9) OK, records in: 6089, records dropped: 1022 output_compression: NoCompression
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.279113) EVENT_LOG_v1 {"time_micros": 1769421383279100, "job": 14, "event": "compaction_finished", "compaction_time_micros": 123986, "compaction_time_cpu_micros": 33416, "output_level": 6, "num_output_files": 1, "total_output_size": 15580489, "num_input_records": 6089, "num_output_records": 5067, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421383281931, "job": 14, "event": "table_file_deletion", "file_number": 29}
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421383286615, "job": 14, "event": "table_file_deletion", "file_number": 27}
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.152969) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.286669) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.286676) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.286679) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.286683) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:56:23 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.286686) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:56:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:23 compute-1 sudo[183251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcpxedbdkuosmdgvocpedvanccodfpwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421383.041298-1489-11914445483834/AnsiballZ_file.py'
Jan 26 09:56:23 compute-1 sudo[183251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:56:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:23.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:56:23 compute-1 python3.9[183253]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 09:56:23 compute-1 sudo[183251]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:23.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:24 compute-1 ceph-mon[80107]: pgmap v417: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 09:56:24 compute-1 python3.9[183403]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:56:25 compute-1 sudo[183553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxvzxconpgdhmggbguufxqtvufrrriex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421384.816253-1642-192164753084171/AnsiballZ_stat.py'
Jan 26 09:56:25 compute-1 sudo[183553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003700 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003700 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:25 compute-1 python3.9[183555]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:56:25 compute-1 sudo[183553]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003700 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:56:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:56:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:25.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:56:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:25.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:26 compute-1 sudo[183696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qubpncwhmuycfqzeshyasdeqjjnmmdec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421384.816253-1642-192164753084171/AnsiballZ_copy.py'
Jan 26 09:56:26 compute-1 sudo[183696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:26 compute-1 podman[183655]: 2026-01-26 09:56:26.047489478 +0000 UTC m=+0.092049243 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 26 09:56:26 compute-1 ceph-mon[80107]: pgmap v418: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:56:26 compute-1 python3.9[183702]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769421384.816253-1642-192164753084171/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:26 compute-1 sudo[183696]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:26 compute-1 sudo[183852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcozymrgmhyjzxqlvyfgmmxxieoqtiaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421386.3845415-1642-137592759078000/AnsiballZ_stat.py'
Jan 26 09:56:26 compute-1 sudo[183852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:26 compute-1 python3.9[183854]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:56:26 compute-1 sudo[183852]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:27 compute-1 sudo[183977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnmhszxmxngriqpylwghnxfmtdytzomm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421386.3845415-1642-137592759078000/AnsiballZ_copy.py'
Jan 26 09:56:27 compute-1 sudo[183977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff14000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:27 compute-1 python3.9[183979]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769421386.3845415-1642-137592759078000/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:27 compute-1 sudo[183977]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003700 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:27.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:56:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:27.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.869740) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421387869799, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 289, "num_deletes": 250, "total_data_size": 124510, "memory_usage": 130432, "flush_reason": "Manual Compaction"}
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421387872745, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 81538, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17911, "largest_seqno": 18195, "table_properties": {"data_size": 79610, "index_size": 156, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 5188, "raw_average_key_size": 19, "raw_value_size": 75849, "raw_average_value_size": 281, "num_data_blocks": 7, "num_entries": 269, "num_filter_entries": 269, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769421384, "oldest_key_time": 1769421384, "file_creation_time": 1769421387, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 3033 microseconds, and 1041 cpu microseconds.
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.872782) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 81538 bytes OK
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.872802) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.874299) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.874316) EVENT_LOG_v1 {"time_micros": 1769421387874311, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.874335) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 122369, prev total WAL file size 122369, number of live WAL files 2.
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.874760) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323530' seq:72057594037927935, type:22 .. '6D67727374617400353031' seq:0, type:0; will stop at (end)
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(79KB)], [30(14MB)]
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421387874819, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 15662027, "oldest_snapshot_seqno": -1}
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4828 keys, 11602143 bytes, temperature: kUnknown
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421387968773, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 11602143, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11569428, "index_size": 19549, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12101, "raw_key_size": 122148, "raw_average_key_size": 25, "raw_value_size": 11481336, "raw_average_value_size": 2378, "num_data_blocks": 814, "num_entries": 4828, "num_filter_entries": 4828, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769421387, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.969011) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 11602143 bytes
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.970539) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 166.6 rd, 123.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 14.9 +0.0 blob) out(11.1 +0.0 blob), read-write-amplify(334.4) write-amplify(142.3) OK, records in: 5336, records dropped: 508 output_compression: NoCompression
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.970558) EVENT_LOG_v1 {"time_micros": 1769421387970549, "job": 16, "event": "compaction_finished", "compaction_time_micros": 94018, "compaction_time_cpu_micros": 28674, "output_level": 6, "num_output_files": 1, "total_output_size": 11602143, "num_input_records": 5336, "num_output_records": 4828, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421387970688, "job": 16, "event": "table_file_deletion", "file_number": 32}
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421387974338, "job": 16, "event": "table_file_deletion", "file_number": 30}
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.874686) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.974391) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.974396) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.974398) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.974399) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:56:27 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.974401) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:56:27 compute-1 sudo[184130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pemgkhmjmlsdhogqpojytheqgmtshfaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421387.7144494-1642-191245863988166/AnsiballZ_stat.py'
Jan 26 09:56:27 compute-1 sudo[184130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:28 compute-1 ceph-mon[80107]: pgmap v419: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:56:28 compute-1 python3.9[184132]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:56:28 compute-1 sudo[184130]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:28 compute-1 sudo[184255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yojjhxyglactyatyhejokprgwviuwszt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421387.7144494-1642-191245863988166/AnsiballZ_copy.py'
Jan 26 09:56:28 compute-1 sudo[184255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:28 compute-1 python3.9[184257]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769421387.7144494-1642-191245863988166/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:28 compute-1 sudo[184255]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:29 compute-1 sudo[184407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbbwhpjypumladeahfqkwhpbpwioslrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421388.9698806-1642-32913076201140/AnsiballZ_stat.py'
Jan 26 09:56:29 compute-1 sudo[184407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003700 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:29 compute-1 python3.9[184409]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:56:29 compute-1 sudo[184407]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:56:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:29.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:56:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:29.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:29 compute-1 sudo[184533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlvllmorgvpugrlztkqswqlmrkzneltw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421388.9698806-1642-32913076201140/AnsiballZ_copy.py'
Jan 26 09:56:29 compute-1 sudo[184533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:30 compute-1 python3.9[184535]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769421388.9698806-1642-32913076201140/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:30 compute-1 sudo[184533]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:30 compute-1 ceph-mon[80107]: pgmap v420: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:56:30 compute-1 sudo[184685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayzyawlblalpbevyflwixttraaujojjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421390.260303-1642-201401313200375/AnsiballZ_stat.py'
Jan 26 09:56:30 compute-1 sudo[184685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:30 compute-1 python3.9[184687]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:56:30 compute-1 sudo[184685]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095631 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:56:31 compute-1 sudo[184810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdglejgqfqaofqetnryjzdzzkshhuhin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421390.260303-1642-201401313200375/AnsiballZ_copy.py'
Jan 26 09:56:31 compute-1 sudo[184810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:31 compute-1 kernel: ganesha.nfsd[163468]: segfault at 50 ip 00007effcac6e32e sp 00007eff38ff8210 error 4 in libntirpc.so.5.8[7effcac53000+2c000] likely on CPU 1 (core 0, socket 1)
Jan 26 09:56:31 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 09:56:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003700 fd 38 proxy ignored for local
Jan 26 09:56:31 compute-1 systemd[1]: Started Process Core Dump (PID 184813/UID 0).
Jan 26 09:56:31 compute-1 python3.9[184812]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769421390.260303-1642-201401313200375/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:31 compute-1 sudo[184810]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:31.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:31.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:31 compute-1 sudo[184965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-albtbbojtmahdspokahpzxdeumsoanyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421391.6138182-1642-202640730484380/AnsiballZ_stat.py'
Jan 26 09:56:31 compute-1 sudo[184965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:32 compute-1 python3.9[184967]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:56:32 compute-1 sudo[184965]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:32 compute-1 ceph-mon[80107]: pgmap v421: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:56:32 compute-1 systemd-coredump[184814]: Process 123723 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 74:
                                                    #0  0x00007effcac6e32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 26 09:56:32 compute-1 sudo[185091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccaucwntdvhmbsfwkalqwlnynnkzzmgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421391.6138182-1642-202640730484380/AnsiballZ_copy.py'
Jan 26 09:56:32 compute-1 sudo[185091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:32 compute-1 systemd[1]: systemd-coredump@4-184813-0.service: Deactivated successfully.
Jan 26 09:56:32 compute-1 systemd[1]: systemd-coredump@4-184813-0.service: Consumed 1.239s CPU time.
Jan 26 09:56:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:56:32 compute-1 podman[185097]: 2026-01-26 09:56:32.704729441 +0000 UTC m=+0.045421596 container died a0e54dfdc80639d6a858a302b9137bdb7eb4ae31fc0c5c95cbf6faee0fa517e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, OSD_FLAVOR=default, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 09:56:32 compute-1 systemd[1]: var-lib-containers-storage-overlay-50ad3cd59e5c5d65126cb5de338a6d9e374cbf09eb3d2dd944710879fec095a1-merged.mount: Deactivated successfully.
Jan 26 09:56:32 compute-1 podman[185097]: 2026-01-26 09:56:32.747321152 +0000 UTC m=+0.088013307 container remove a0e54dfdc80639d6a858a302b9137bdb7eb4ae31fc0c5c95cbf6faee0fa517e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:56:32 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 09:56:32 compute-1 python3.9[185093]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769421391.6138182-1642-202640730484380/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:32 compute-1 sudo[185091]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:32 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 09:56:32 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 2.384s CPU time.
Jan 26 09:56:33 compute-1 sudo[185290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uprrnxfkgukzxqhritrwobrboffbpjmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421393.0110707-1642-162888978341113/AnsiballZ_stat.py'
Jan 26 09:56:33 compute-1 sudo[185290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:33 compute-1 python3.9[185292]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:56:33 compute-1 sudo[185290]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:33.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:33.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:33 compute-1 sudo[185414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdbtjinpoxspbdgwjnorudvfcqpmjyss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421393.0110707-1642-162888978341113/AnsiballZ_copy.py'
Jan 26 09:56:33 compute-1 sudo[185414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:34 compute-1 python3.9[185416]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769421393.0110707-1642-162888978341113/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:34 compute-1 sudo[185414]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:34 compute-1 ceph-mon[80107]: pgmap v422: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:56:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:56:34 compute-1 sudo[185568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcozrrtzzjjnbfemchfdwfvweiiaoieh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421394.3208778-1642-198377813614528/AnsiballZ_stat.py'
Jan 26 09:56:34 compute-1 sudo[185568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:34 compute-1 python3.9[185570]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:56:34 compute-1 sudo[185568]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:35 compute-1 sshd-session[185516]: Connection closed by authenticating user root 152.42.136.110 port 45940 [preauth]
Jan 26 09:56:35 compute-1 sudo[185693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibiqnuwxgccbllrqxpjwqxqnckyeqgpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421394.3208778-1642-198377813614528/AnsiballZ_copy.py'
Jan 26 09:56:35 compute-1 sudo[185693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:35 compute-1 python3.9[185695]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769421394.3208778-1642-198377813614528/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:35 compute-1 sudo[185693]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:35.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:35.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095636 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:56:36 compute-1 ceph-mon[80107]: pgmap v423: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:56:37 compute-1 sudo[185846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-halunkgsdyfqxtssubsejcqkwndfikuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421396.8748472-1981-277991922801203/AnsiballZ_command.py'
Jan 26 09:56:37 compute-1 sudo[185846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095637 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:56:37 compute-1 python3.9[185848]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 26 09:56:37 compute-1 sudo[185846]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:37.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:56:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:37.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:38 compute-1 sudo[186002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmhiqiazbzvntxchnqrgarmvtzvrjrrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421397.8524368-2008-52263767002489/AnsiballZ_file.py'
Jan 26 09:56:38 compute-1 sudo[186002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:38 compute-1 ceph-mon[80107]: pgmap v424: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:56:38 compute-1 sudo[186005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:56:38 compute-1 sudo[186005]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:56:38 compute-1 sudo[186005]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:38 compute-1 python3.9[186004]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:38 compute-1 sudo[186002]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:38 compute-1 sshd-session[185851]: Connection closed by authenticating user root 178.62.249.31 port 52450 [preauth]
Jan 26 09:56:39 compute-1 sudo[186179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjtwqujbbzgofvjkaxfmitbviozxjtcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421398.6357222-2008-272578212377583/AnsiballZ_file.py'
Jan 26 09:56:39 compute-1 sudo[186179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:39 compute-1 python3.9[186181]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:39 compute-1 sudo[186179]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:39.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:39.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:39 compute-1 sudo[186332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aigegnxutewdldunaezniraexiesezwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421399.6360345-2008-21298947173351/AnsiballZ_file.py'
Jan 26 09:56:39 compute-1 sudo[186332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:40 compute-1 python3.9[186334]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:40 compute-1 sudo[186332]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:40 compute-1 ceph-mon[80107]: pgmap v425: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:56:40 compute-1 sudo[186484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uecfnjjbxleywgqvxqxwmjlisgpjutxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421400.27641-2008-170878105955202/AnsiballZ_file.py'
Jan 26 09:56:40 compute-1 sudo[186484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:40 compute-1 python3.9[186486]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:40 compute-1 sudo[186484]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:41 compute-1 sudo[186636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvpvalnwwzgcnaamgdhtuqrrsviembgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421400.9740822-2008-198670636436975/AnsiballZ_file.py'
Jan 26 09:56:41 compute-1 sudo[186636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:41 compute-1 python3.9[186638]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:41 compute-1 sudo[186636]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:56:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:41.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:56:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:41.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:41 compute-1 sudo[186789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhlgraetfhhobgmeilydlyqydcfvneuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421401.6257896-2008-124135697466110/AnsiballZ_file.py'
Jan 26 09:56:41 compute-1 sudo[186789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:42 compute-1 python3.9[186791]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:42 compute-1 sudo[186789]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:42 compute-1 podman[186801]: 2026-01-26 09:56:42.303435152 +0000 UTC m=+0.078783279 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Jan 26 09:56:42 compute-1 ceph-mon[80107]: pgmap v426: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:56:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:56:42 compute-1 sudo[186965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfsftxicscolnmfcgyomnlcxeohwpwxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421402.2889218-2008-227030004359706/AnsiballZ_file.py'
Jan 26 09:56:42 compute-1 sudo[186965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:42 compute-1 python3.9[186967]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:42 compute-1 sudo[186965]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:42 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 5.
Jan 26 09:56:42 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:56:42 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 2.384s CPU time.
Jan 26 09:56:42 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 09:56:43 compute-1 podman[187084]: 2026-01-26 09:56:43.171683066 +0000 UTC m=+0.059991879 container create f91999dbe8ac46def68a1db73fc621f26c01213fbdcc9e21e2564db7aefa11b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 09:56:43 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7db4ed4d191b8d4606088a1397a8878c3b2be7577ad0f488114cb7161674c534/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 09:56:43 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7db4ed4d191b8d4606088a1397a8878c3b2be7577ad0f488114cb7161674c534/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:56:43 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7db4ed4d191b8d4606088a1397a8878c3b2be7577ad0f488114cb7161674c534/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 09:56:43 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7db4ed4d191b8d4606088a1397a8878c3b2be7577ad0f488114cb7161674c534/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 09:56:43 compute-1 podman[187084]: 2026-01-26 09:56:43.234054679 +0000 UTC m=+0.122363482 container init f91999dbe8ac46def68a1db73fc621f26c01213fbdcc9e21e2564db7aefa11b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 09:56:43 compute-1 podman[187084]: 2026-01-26 09:56:43.24111374 +0000 UTC m=+0.129422543 container start f91999dbe8ac46def68a1db73fc621f26c01213fbdcc9e21e2564db7aefa11b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 09:56:43 compute-1 bash[187084]: f91999dbe8ac46def68a1db73fc621f26c01213fbdcc9e21e2564db7aefa11b3
Jan 26 09:56:43 compute-1 podman[187084]: 2026-01-26 09:56:43.150180212 +0000 UTC m=+0.038489075 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:56:43 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:56:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 09:56:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 09:56:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 09:56:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 09:56:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 09:56:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 09:56:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 09:56:43 compute-1 sudo[187220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msqhcxbdhjxqbpkjimkukfyktjsemkod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421403.0510886-2008-216414864420828/AnsiballZ_file.py'
Jan 26 09:56:43 compute-1 sudo[187220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:56:43 compute-1 python3.9[187222]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:43 compute-1 sudo[187220]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:43.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:43 compute-1 sudo[187373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oerwxufryoewogskzfbqorydjbnulxdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421403.680877-2008-166851099953196/AnsiballZ_file.py'
Jan 26 09:56:43 compute-1 sudo[187373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:43.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:44 compute-1 python3.9[187375]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:44 compute-1 sudo[187373]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:44 compute-1 ceph-mon[80107]: pgmap v427: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 26 09:56:44 compute-1 sudo[187525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unbhxxuoveimhtwllphlxjzxwjblyllr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421404.2856743-2008-79488314755618/AnsiballZ_file.py'
Jan 26 09:56:44 compute-1 sudo[187525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:44 compute-1 python3.9[187527]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:44 compute-1 sudo[187525]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:45 compute-1 sudo[187677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzuimnyblmxlyuqrxyflplzpdolwcjmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421404.9355254-2008-113055590710821/AnsiballZ_file.py'
Jan 26 09:56:45 compute-1 sudo[187677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:45 compute-1 python3.9[187679]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:45 compute-1 sudo[187677]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:45.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:45.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:46 compute-1 auditd[700]: Audit daemon rotating log files
Jan 26 09:56:46 compute-1 sudo[187830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnzfkubxuksvgqgsmngrghsxvzlxtozw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421405.6672525-2008-14921533207982/AnsiballZ_file.py'
Jan 26 09:56:46 compute-1 sudo[187830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:46 compute-1 python3.9[187832]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:46 compute-1 sudo[187830]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:46 compute-1 ceph-mon[80107]: pgmap v428: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 170 B/s wr, 1 op/s
Jan 26 09:56:46 compute-1 sudo[187982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogqfvumuskqiveireyolpfvjvdcvjyhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421406.4574792-2008-169443479180327/AnsiballZ_file.py'
Jan 26 09:56:46 compute-1 sudo[187982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:47 compute-1 python3.9[187984]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:47 compute-1 sudo[187982]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:47.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:56:47 compute-1 sudo[188135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuxeyowxefpsmzkfufoqtuoqmosiccag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421407.3822803-2008-197627074709223/AnsiballZ_file.py'
Jan 26 09:56:47 compute-1 sudo[188135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:47 compute-1 python3.9[188137]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:47 compute-1 sudo[188135]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:47.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:48 compute-1 ceph-mon[80107]: pgmap v429: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 170 B/s wr, 1 op/s
Jan 26 09:56:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:49 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:56:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:49 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:56:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:56:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:49.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:49 compute-1 sudo[188288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njlyebmpudyutqlqbqlguugaqkwddfyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421409.541111-2305-105801427888392/AnsiballZ_stat.py'
Jan 26 09:56:49 compute-1 sudo[188288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:56:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:49.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:56:50 compute-1 python3.9[188290]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:56:50 compute-1 sudo[188288]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:50 compute-1 sudo[188411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbbfmeclldlnbvlfbzwkhjctwzczqkoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421409.541111-2305-105801427888392/AnsiballZ_copy.py'
Jan 26 09:56:50 compute-1 sudo[188411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:50 compute-1 ceph-mon[80107]: pgmap v430: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 170 B/s wr, 1 op/s
Jan 26 09:56:50 compute-1 python3.9[188413]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421409.541111-2305-105801427888392/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:50 compute-1 sudo[188411]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:51 compute-1 sudo[188563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohltaoqhepqzolxjxtzbfjpxlggamnjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421410.9390295-2305-157806444756689/AnsiballZ_stat.py'
Jan 26 09:56:51 compute-1 sudo[188563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:51 compute-1 python3.9[188565]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:56:51 compute-1 sudo[188563]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:51 compute-1 ceph-mon[80107]: pgmap v431: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:56:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:56:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:51.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:56:51 compute-1 sudo[188687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-styuwdkgviymvbzkjbunknbrkvuqsnwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421410.9390295-2305-157806444756689/AnsiballZ_copy.py'
Jan 26 09:56:51 compute-1 sudo[188687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:56:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:51.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:56:52 compute-1 python3.9[188689]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421410.9390295-2305-157806444756689/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:52 compute-1 sudo[188687]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:52 compute-1 sudo[188839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvkbjuwwrersivmtthclsnzqtzwfytbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421412.2024925-2305-102792998276836/AnsiballZ_stat.py'
Jan 26 09:56:52 compute-1 sudo[188839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:56:52 compute-1 python3.9[188841]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:56:52 compute-1 sudo[188839]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:53 compute-1 sudo[188962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olfgjtxykpajjystwrullbolafrkqfqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421412.2024925-2305-102792998276836/AnsiballZ_copy.py'
Jan 26 09:56:53 compute-1 sudo[188962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:53 compute-1 python3.9[188964]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421412.2024925-2305-102792998276836/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:53 compute-1 sudo[188962]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:53.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:53 compute-1 sudo[189115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hawaqchiymwvhfezfbmehvhayhhvmwey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421413.445095-2305-24697996549822/AnsiballZ_stat.py'
Jan 26 09:56:53 compute-1 sudo[189115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:56:53.917 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:56:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:56:53.918 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:56:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:56:53.918 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:56:53 compute-1 ceph-mon[80107]: pgmap v432: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:56:53 compute-1 python3.9[189117]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:56:53 compute-1 sudo[189115]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:53.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:54 compute-1 sudo[189238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bagwccffqtufcowetgkhvtaqfxfsayzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421413.445095-2305-24697996549822/AnsiballZ_copy.py'
Jan 26 09:56:54 compute-1 sudo[189238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:54 compute-1 python3.9[189240]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421413.445095-2305-24697996549822/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:54 compute-1 sudo[189238]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:55 compute-1 sudo[189390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bolhzaifgvkyartuygaobqzoeddlyylp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421414.7412686-2305-26470474788187/AnsiballZ_stat.py'
Jan 26 09:56:55 compute-1 sudo[189390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:55 compute-1 python3.9[189392]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:56:55 compute-1 sudo[189390]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd110000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f8000da0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:55 compute-1 sudo[189529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlopwbhcjxqotyprtqdpkdgkdirbkppw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421414.7412686-2305-26470474788187/AnsiballZ_copy.py'
Jan 26 09:56:55 compute-1 sudo[189529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:55.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:55 compute-1 python3.9[189531]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421414.7412686-2305-26470474788187/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:55 compute-1 sudo[189529]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:55 compute-1 ceph-mon[80107]: pgmap v433: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.1 KiB/s rd, 1.1 KiB/s wr, 4 op/s
Jan 26 09:56:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:55.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:56 compute-1 podman[189631]: 2026-01-26 09:56:56.30894044 +0000 UTC m=+0.069932259 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 09:56:56 compute-1 sudo[189701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnbzfcxckgqqkwfljumgtwyvhkwdyjsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421415.9786768-2305-257256167709928/AnsiballZ_stat.py'
Jan 26 09:56:56 compute-1 sudo[189701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:56 compute-1 python3.9[189703]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:56:56 compute-1 sudo[189701]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:57 compute-1 sudo[189824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aalvjybtsgyxqqifqcfamrociljjvwlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421415.9786768-2305-257256167709928/AnsiballZ_copy.py'
Jan 26 09:56:57 compute-1 sudo[189824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:57 compute-1 python3.9[189826]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421415.9786768-2305-257256167709928/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:57 compute-1 sudo[189824]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:57 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc000da0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:57 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f8000da0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:57 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:57.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:56:57 compute-1 sudo[189977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxmucgdnjtzhnpbkxwxjbrpcovdzozlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421417.404612-2305-154213677875054/AnsiballZ_stat.py'
Jan 26 09:56:57 compute-1 sudo[189977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:57.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:57 compute-1 ceph-mon[80107]: pgmap v434: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:56:58 compute-1 python3.9[189979]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:56:58 compute-1 sudo[189977]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095658 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:56:58 compute-1 sudo[190118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iecrlbfheoklwthltndmqeujqclrcjur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421417.404612-2305-154213677875054/AnsiballZ_copy.py'
Jan 26 09:56:58 compute-1 sudo[190086]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:56:58 compute-1 sudo[190086]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:56:58 compute-1 sudo[190118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:58 compute-1 sudo[190086]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:58 compute-1 python3.9[190127]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421417.404612-2305-154213677875054/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:56:58 compute-1 sudo[190118]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:59 compute-1 sudo[190277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whdjrmycxajztomodlbkuimbzphewtix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421418.8900807-2305-92383411143530/AnsiballZ_stat.py'
Jan 26 09:56:59 compute-1 sudo[190277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095659 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:56:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:59 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8000d00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:59 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc001ab0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:59 compute-1 python3.9[190279]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:56:59 compute-1 sudo[190277]: pam_unix(sudo:session): session closed for user root
Jan 26 09:56:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:59 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc001ab0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:56:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:59.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:56:59 compute-1 sudo[190401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzaoqeuptzhzhaddfxqjjjrohhwguvtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421418.8900807-2305-92383411143530/AnsiballZ_copy.py'
Jan 26 09:56:59 compute-1 sudo[190401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:56:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:56:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:56:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:59.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:00 compute-1 ceph-mon[80107]: pgmap v435: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:57:00 compute-1 python3.9[190403]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421418.8900807-2305-92383411143530/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:00 compute-1 sudo[190401]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:00 compute-1 sudo[190553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdtupodjcrhvgzgkccgbzrarlaseoucr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421420.258999-2305-106366403099726/AnsiballZ_stat.py'
Jan 26 09:57:00 compute-1 sudo[190553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:00 compute-1 python3.9[190555]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:57:00 compute-1 sudo[190553]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:01 compute-1 sudo[190676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkuqmyorzmwjwmixrmjrbrjourwmkhga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421420.258999-2305-106366403099726/AnsiballZ_copy.py'
Jan 26 09:57:01 compute-1 sudo[190676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:01 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:01 compute-1 python3.9[190678]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421420.258999-2305-106366403099726/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:01 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8001820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:01 compute-1 sudo[190676]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:01 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc001ab0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:57:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:01.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:57:01 compute-1 sudo[190829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngmvfnilemlyxahjvxbmigwuautoydkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421421.6240442-2305-272277287284044/AnsiballZ_stat.py'
Jan 26 09:57:01 compute-1 sudo[190829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:57:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:01.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:57:02 compute-1 ceph-mon[80107]: pgmap v436: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 09:57:02 compute-1 python3.9[190831]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:57:02 compute-1 sudo[190829]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:57:02 compute-1 sudo[190952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rygrlkxaamnuzhttzsenioodqffhqffo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421421.6240442-2305-272277287284044/AnsiballZ_copy.py'
Jan 26 09:57:02 compute-1 sudo[190952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:02 compute-1 python3.9[190954]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421421.6240442-2305-272277287284044/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:02 compute-1 sudo[190952]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:03 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f8001ea0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:03 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:03 compute-1 sudo[191105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuajquetdruthctsyvlvfayrpsiyhodr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421423.0960348-2305-147349478184363/AnsiballZ_stat.py'
Jan 26 09:57:03 compute-1 sudo[191105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:03 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8001820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999956s ======
Jan 26 09:57:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:03.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999956s
Jan 26 09:57:03 compute-1 python3.9[191107]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:57:03 compute-1 sudo[191105]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999955s ======
Jan 26 09:57:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:03.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999955s
Jan 26 09:57:04 compute-1 ceph-mon[80107]: pgmap v437: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 170 B/s wr, 1 op/s
Jan 26 09:57:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:57:04 compute-1 sudo[191228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwnzmyeidafdfuwbgznlayluiubogvxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421423.0960348-2305-147349478184363/AnsiballZ_copy.py'
Jan 26 09:57:04 compute-1 sudo[191228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:04 compute-1 python3.9[191230]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421423.0960348-2305-147349478184363/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:04 compute-1 sudo[191228]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:05 compute-1 sudo[191380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaimhsymgpzjniuossachtdqimgpidxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421424.7271461-2305-186578840990222/AnsiballZ_stat.py'
Jan 26 09:57:05 compute-1 sudo[191380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:05 compute-1 python3.9[191382]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:57:05 compute-1 sudo[191380]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:05 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc001ab0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:05 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:05 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999956s ======
Jan 26 09:57:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:05.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999956s
Jan 26 09:57:05 compute-1 sudo[191504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axalsxnbgjtyyphpxkwfxsrdprsbaatt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421424.7271461-2305-186578840990222/AnsiballZ_copy.py'
Jan 26 09:57:05 compute-1 sudo[191504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:05.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:06 compute-1 ceph-mon[80107]: pgmap v438: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 170 B/s wr, 1 op/s
Jan 26 09:57:06 compute-1 python3.9[191506]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421424.7271461-2305-186578840990222/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:06 compute-1 sudo[191504]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:06 compute-1 sudo[191656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jewymdgaelsxshijrqhhddvznqsfpotg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421426.3383253-2305-240542570516077/AnsiballZ_stat.py'
Jan 26 09:57:06 compute-1 sudo[191656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:06 compute-1 python3.9[191658]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:57:06 compute-1 sudo[191656]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:07 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8001820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:07 compute-1 sudo[191779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnlgnmdisvugdqrnihejmyunfjjpjlyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421426.3383253-2305-240542570516077/AnsiballZ_copy.py'
Jan 26 09:57:07 compute-1 sudo[191779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:07 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:07 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:07 compute-1 python3.9[191781]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421426.3383253-2305-240542570516077/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:57:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:07.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:07 compute-1 sudo[191779]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:07.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:08 compute-1 ceph-mon[80107]: pgmap v439: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:57:08 compute-1 sudo[191933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smygmqkqiytusgzclksdpxhyxntngqej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421427.9182436-2305-241615058389781/AnsiballZ_stat.py'
Jan 26 09:57:08 compute-1 sudo[191933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:08 compute-1 python3.9[191936]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:57:08 compute-1 sudo[191933]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:08 compute-1 sshd-session[191932]: Invalid user ubuntu from 104.248.198.71 port 51380
Jan 26 09:57:08 compute-1 sshd-session[191932]: Connection closed by invalid user ubuntu 104.248.198.71 port 51380 [preauth]
Jan 26 09:57:08 compute-1 sudo[192057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfyfdvxjnoctztfvoltsrgrdvzfhyrpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421427.9182436-2305-241615058389781/AnsiballZ_copy.py'
Jan 26 09:57:08 compute-1 sudo[192057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:09 compute-1 python3.9[192059]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421427.9182436-2305-241615058389781/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:09 compute-1 sudo[192057]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:09 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:09 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:09 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:09.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:10.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:10 compute-1 python3.9[192210]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:57:10 compute-1 ceph-mon[80107]: pgmap v440: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:57:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:11 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:11 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:11 compute-1 sudo[192364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfdrdipuqcilzqknygmfcvxfbpwxmitx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421431.058993-2923-10718524440794/AnsiballZ_seboolean.py'
Jan 26 09:57:11 compute-1 sudo[192364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:11 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999956s ======
Jan 26 09:57:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:11.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999956s
Jan 26 09:57:11 compute-1 python3.9[192366]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 26 09:57:11 compute-1 ceph-mon[80107]: pgmap v441: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:57:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999955s ======
Jan 26 09:57:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:12.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999955s
Jan 26 09:57:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:57:13 compute-1 sudo[192364]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095713 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:57:13 compute-1 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 26 09:57:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:13 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:13 compute-1 podman[192395]: 2026-01-26 09:57:13.40109565 +0000 UTC m=+0.157416331 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 09:57:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:13 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:13 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999956s ======
Jan 26 09:57:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:13.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999956s
Jan 26 09:57:13 compute-1 sudo[192548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mojsomadifircivliijzjmyddnypdvha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421433.3465059-2947-183111392926515/AnsiballZ_copy.py'
Jan 26 09:57:13 compute-1 sudo[192548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:13 compute-1 python3.9[192550]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:13 compute-1 sudo[192548]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:13 compute-1 ceph-mon[80107]: pgmap v442: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:57:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:14.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:14 compute-1 sudo[192700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhaflgkssfhseuocxsdtnhoqnhfwtark ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421434.1371603-2947-140870852961197/AnsiballZ_copy.py'
Jan 26 09:57:14 compute-1 sudo[192700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:14 compute-1 python3.9[192702]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:14 compute-1 sudo[192700]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:15 compute-1 sudo[192852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txjpguygqctkqmmizrdwxtcyhkqfjtlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421434.9183652-2947-128508381053098/AnsiballZ_copy.py'
Jan 26 09:57:15 compute-1 sudo[192852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:15 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:15 compute-1 python3.9[192854]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:15 compute-1 sudo[192852]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:15 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:15 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999956s ======
Jan 26 09:57:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:15.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999956s
Jan 26 09:57:15 compute-1 ceph-mon[80107]: pgmap v443: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 09:57:16 compute-1 sudo[193005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhxbyuhyloecfpgxiigahikgmaanjepl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421435.6116269-2947-279059149686055/AnsiballZ_copy.py'
Jan 26 09:57:16 compute-1 sudo[193005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999956s ======
Jan 26 09:57:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:16.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999956s
Jan 26 09:57:16 compute-1 python3.9[193007]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:16 compute-1 sudo[193005]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:16 compute-1 sudo[193157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhtiatafvswbkzivyhrhduolduovzzym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421436.4430594-2947-126477268468833/AnsiballZ_copy.py'
Jan 26 09:57:16 compute-1 sudo[193157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:17 compute-1 python3.9[193159]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:17 compute-1 sudo[193157]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:17 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:17 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:17 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:57:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:17.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:17 compute-1 sudo[193312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llvdfzkkxlavqfyrmhygmzsebeduefbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421437.3643186-3055-244799526351063/AnsiballZ_copy.py'
Jan 26 09:57:17 compute-1 sudo[193312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:17 compute-1 python3.9[193314]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:17 compute-1 sudo[193312]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:18.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:18 compute-1 ceph-mon[80107]: pgmap v444: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:57:18 compute-1 sshd-session[193266]: Connection closed by authenticating user root 152.42.136.110 port 33208 [preauth]
Jan 26 09:57:18 compute-1 sudo[193391]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:57:18 compute-1 sudo[193391]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:57:18 compute-1 sudo[193391]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:18 compute-1 sudo[193439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 09:57:18 compute-1 sudo[193439]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:57:18 compute-1 sudo[193537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmvcchwurjcsdqjwxysfidxvsusxapvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421438.1924634-3055-214681687644031/AnsiballZ_copy.py'
Jan 26 09:57:18 compute-1 sudo[193537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:18 compute-1 sudo[193492]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:57:18 compute-1 sudo[193492]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:57:18 compute-1 sudo[193492]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:18 compute-1 python3.9[193540]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:18 compute-1 sudo[193537]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:57:19 compute-1 sudo[193439]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:19 compute-1 sudo[193722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hydmdeeaolqsrmklhgyzsayhgeosnqzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421438.9383872-3055-187237627418754/AnsiballZ_copy.py'
Jan 26 09:57:19 compute-1 sudo[193722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:19 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:19 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:19 compute-1 python3.9[193724]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:19 compute-1 sudo[193722]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:19 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999956s ======
Jan 26 09:57:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:19.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999956s
Jan 26 09:57:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999956s ======
Jan 26 09:57:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:20.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999956s
Jan 26 09:57:20 compute-1 ceph-mon[80107]: pgmap v445: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:57:20 compute-1 sudo[193875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kanhxewbpftcogxoeqoxzxnllsdlbsyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421439.766041-3055-270964423212399/AnsiballZ_copy.py'
Jan 26 09:57:20 compute-1 sudo[193875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:20 compute-1 python3.9[193877]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:20 compute-1 sudo[193875]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:21 compute-1 sudo[194027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiqkaduiodiqlxnqebagijpgazntaciu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421440.6112847-3055-162371733807773/AnsiballZ_copy.py'
Jan 26 09:57:21 compute-1 sudo[194027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:21 compute-1 python3.9[194029]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:21 compute-1 sudo[194027]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:21 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:21 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:21 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:21.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:21 compute-1 sudo[194180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-expoqentvytmxwpbhdyhyfadlnrcbris ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421441.58014-3163-181510920113101/AnsiballZ_systemd.py'
Jan 26 09:57:21 compute-1 sudo[194180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:22.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:22 compute-1 ceph-mon[80107]: pgmap v446: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:57:22 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:57:22 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:57:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:22 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:57:22 compute-1 python3.9[194182]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 09:57:22 compute-1 systemd[1]: Reloading.
Jan 26 09:57:22 compute-1 systemd-rc-local-generator[194209]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:57:22 compute-1 systemd-sysv-generator[194212]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:57:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:57:22 compute-1 systemd[1]: Starting libvirt logging daemon socket...
Jan 26 09:57:22 compute-1 systemd[1]: Listening on libvirt logging daemon socket.
Jan 26 09:57:22 compute-1 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 26 09:57:22 compute-1 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 26 09:57:22 compute-1 systemd[1]: Starting libvirt logging daemon...
Jan 26 09:57:22 compute-1 systemd[1]: Started libvirt logging daemon.
Jan 26 09:57:22 compute-1 sudo[194180]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:23 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:57:23 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 09:57:23 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:57:23 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:57:23 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 09:57:23 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 09:57:23 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:57:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:23 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:23 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:23 compute-1 sudo[194373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwvzlcadlflfnfykcxfgvpxlvnqxnphf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421443.1362488-3163-271836002584891/AnsiballZ_systemd.py'
Jan 26 09:57:23 compute-1 sudo[194373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:23 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001999912s ======
Jan 26 09:57:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:23.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001999912s
Jan 26 09:57:23 compute-1 python3.9[194375]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 09:57:23 compute-1 systemd[1]: Reloading.
Jan 26 09:57:23 compute-1 systemd-rc-local-generator[194405]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:57:23 compute-1 systemd-sysv-generator[194408]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:57:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:24.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:24 compute-1 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 26 09:57:24 compute-1 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 26 09:57:24 compute-1 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 26 09:57:24 compute-1 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 26 09:57:24 compute-1 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 26 09:57:24 compute-1 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 26 09:57:24 compute-1 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 26 09:57:24 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Jan 26 09:57:24 compute-1 systemd[1]: Started libvirt nodedev daemon.
Jan 26 09:57:24 compute-1 sudo[194373]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:24 compute-1 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 26 09:57:24 compute-1 ceph-mon[80107]: pgmap v447: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:57:24 compute-1 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 26 09:57:24 compute-1 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 26 09:57:24 compute-1 sudo[194600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmjwpffmlxrjjuqbvevcarlrzrorbnxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421444.5543866-3163-70812320220743/AnsiballZ_systemd.py'
Jan 26 09:57:24 compute-1 sudo[194600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:25 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:57:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:25 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:57:25 compute-1 python3.9[194602]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 09:57:25 compute-1 systemd[1]: Reloading.
Jan 26 09:57:25 compute-1 sshd-session[194376]: Connection closed by authenticating user root 178.62.249.31 port 46570 [preauth]
Jan 26 09:57:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:25 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:25 compute-1 systemd-rc-local-generator[194632]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:57:25 compute-1 systemd-sysv-generator[194636]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:57:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:25 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:25 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:25 compute-1 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 26 09:57:25 compute-1 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 26 09:57:25 compute-1 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 26 09:57:25 compute-1 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 26 09:57:25 compute-1 ceph-mon[80107]: pgmap v448: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 09:57:25 compute-1 systemd[1]: Starting libvirt proxy daemon...
Jan 26 09:57:25 compute-1 systemd[1]: Started libvirt proxy daemon.
Jan 26 09:57:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999956s ======
Jan 26 09:57:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:25.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999956s
Jan 26 09:57:25 compute-1 sudo[194600]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:25 compute-1 setroubleshoot[194415]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l a476e045-7b7a-4a8e-94a5-35aae6c63fc9
Jan 26 09:57:25 compute-1 setroubleshoot[194415]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 26 09:57:25 compute-1 setroubleshoot[194415]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l a476e045-7b7a-4a8e-94a5-35aae6c63fc9
Jan 26 09:57:25 compute-1 setroubleshoot[194415]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 26 09:57:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:26.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:26 compute-1 sudo[194815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkwnnjphkdoinwtxbdzpfkegbahvviij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421445.9191086-3163-58340602786455/AnsiballZ_systemd.py'
Jan 26 09:57:26 compute-1 sudo[194815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:26 compute-1 python3.9[194817]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 09:57:26 compute-1 systemd[1]: Reloading.
Jan 26 09:57:26 compute-1 systemd-rc-local-generator[194859]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:57:26 compute-1 systemd-sysv-generator[194862]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:57:26 compute-1 podman[194819]: 2026-01-26 09:57:26.784493739 +0000 UTC m=+0.111848067 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 26 09:57:27 compute-1 systemd[1]: Listening on libvirt locking daemon socket.
Jan 26 09:57:27 compute-1 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 26 09:57:27 compute-1 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 26 09:57:27 compute-1 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 26 09:57:27 compute-1 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 26 09:57:27 compute-1 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 26 09:57:27 compute-1 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 26 09:57:27 compute-1 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 26 09:57:27 compute-1 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 26 09:57:27 compute-1 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 26 09:57:27 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Jan 26 09:57:27 compute-1 systemd[1]: Started libvirt QEMU daemon.
Jan 26 09:57:27 compute-1 sudo[194815]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:27 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:27 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:27 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd104001cd0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:57:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:27.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:27 compute-1 sudo[195052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdnbowynbrhwjxjhvuhsneydmrztzteu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421447.3809183-3163-53460213381856/AnsiballZ_systemd.py'
Jan 26 09:57:27 compute-1 sudo[195052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:28 compute-1 ceph-mon[80107]: pgmap v449: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 09:57:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:28.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:28 compute-1 python3.9[195054]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 09:57:28 compute-1 systemd[1]: Reloading.
Jan 26 09:57:28 compute-1 systemd-rc-local-generator[195081]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:57:28 compute-1 systemd-sysv-generator[195084]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:57:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:28 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:57:28 compute-1 sudo[195086]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 09:57:28 compute-1 sudo[195086]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:57:28 compute-1 sudo[195086]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:28 compute-1 systemd[1]: Starting libvirt secret daemon socket...
Jan 26 09:57:28 compute-1 systemd[1]: Listening on libvirt secret daemon socket.
Jan 26 09:57:28 compute-1 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 26 09:57:28 compute-1 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 26 09:57:28 compute-1 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 26 09:57:28 compute-1 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 26 09:57:28 compute-1 systemd[1]: Starting libvirt secret daemon...
Jan 26 09:57:28 compute-1 systemd[1]: Started libvirt secret daemon.
Jan 26 09:57:28 compute-1 sudo[195052]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:29 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:57:29 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:57:29 compute-1 sudo[195289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuypappcarflqeyqxvyvlhxirnhpnnoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421449.0013707-3274-110021914492456/AnsiballZ_file.py'
Jan 26 09:57:29 compute-1 sudo[195289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:29 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:29 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:29 compute-1 python3.9[195291]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:29 compute-1 sudo[195289]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:29 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:57:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:29.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:57:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:30.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:30 compute-1 ceph-mon[80107]: pgmap v450: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 09:57:30 compute-1 sudo[195442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luoqaommvndeomkaeppwiqqdlwdonnkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421449.8611348-3298-279585662853750/AnsiballZ_find.py'
Jan 26 09:57:30 compute-1 sudo[195442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:30 compute-1 python3.9[195444]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 09:57:30 compute-1 sudo[195442]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:30 compute-1 sudo[195594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnvtcqwynsdehobotjsmzzrskrsfyakh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421450.6417124-3322-180891333056766/AnsiballZ_command.py'
Jan 26 09:57:30 compute-1 sudo[195594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:31 compute-1 python3.9[195596]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                             echo ceph
                                             awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:57:31 compute-1 sudo[195594]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:31 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:31 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:31 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:31.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:57:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:32.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:57:32 compute-1 python3.9[195751]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 09:57:32 compute-1 ceph-mon[80107]: pgmap v451: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:57:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:57:33 compute-1 python3.9[195901]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:57:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:33 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:33 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd104001fd0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:33 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:33.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:33 compute-1 python3.9[196023]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421452.6144001-3379-247936633090530/.source.xml follow=False _original_basename=secret.xml.j2 checksum=8bb860fb1574c7989940fddd89a1bc8580864aba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:57:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:34.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:57:34 compute-1 ceph-mon[80107]: pgmap v452: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Jan 26 09:57:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:57:34 compute-1 sudo[196173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybournhusmzskmlfbazyvmcoyafhusph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421454.1342456-3424-89453900848513/AnsiballZ_command.py'
Jan 26 09:57:34 compute-1 sudo[196173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:34 compute-1 python3.9[196175]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 1a70b85d-e3fd-5814-8a6a-37ea00fcae30
                                             virsh secret-define --file /tmp/secret.xml
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:57:34 compute-1 polkitd[43555]: Registered Authentication Agent for unix-process:196177:341522 (system bus name :1.1868 [pkttyagent --process 196177 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Jan 26 09:57:34 compute-1 polkitd[43555]: Unregistered Authentication Agent for unix-process:196177:341522 (system bus name :1.1868, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Jan 26 09:57:34 compute-1 polkitd[43555]: Registered Authentication Agent for unix-process:196176:341522 (system bus name :1.1869 [pkttyagent --process 196176 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Jan 26 09:57:34 compute-1 polkitd[43555]: Unregistered Authentication Agent for unix-process:196176:341522 (system bus name :1.1869, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Jan 26 09:57:34 compute-1 sudo[196173]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095735 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:57:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:35 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:35 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:35 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd1040028f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:35.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:35 compute-1 python3.9[196338]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:36 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 26 09:57:36 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.152s CPU time.
Jan 26 09:57:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:36.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:36 compute-1 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 26 09:57:36 compute-1 ceph-mon[80107]: pgmap v453: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 26 09:57:36 compute-1 sudo[196488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdivoauxugemkxgqtihjffhfogdmlgsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421456.0176842-3472-40896157958197/AnsiballZ_command.py'
Jan 26 09:57:36 compute-1 sudo[196488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:36 compute-1 sudo[196488]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:37 compute-1 sudo[196641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwbdrgyvlarekhdwkyqrycilmooaoujg ; FSID=1a70b85d-e3fd-5814-8a6a-37ea00fcae30 KEY=AQDlNXdpAAAAABAAkYdaCUlKVeiqmlhElLFrLA== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421456.9312656-3496-240924126905466/AnsiballZ_command.py'
Jan 26 09:57:37 compute-1 sudo[196641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:37 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd1040028f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:37 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:37 compute-1 polkitd[43555]: Registered Authentication Agent for unix-process:196644:341807 (system bus name :1.1872 [pkttyagent --process 196644 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Jan 26 09:57:37 compute-1 polkitd[43555]: Unregistered Authentication Agent for unix-process:196644:341807 (system bus name :1.1872, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Jan 26 09:57:37 compute-1 sudo[196641]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:37 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:57:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:57:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:37.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:57:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:38.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:38 compute-1 ceph-mon[80107]: pgmap v454: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:57:38 compute-1 sudo[196800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yngckzwdpqvjiwlkmicbleodndhgsfwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421457.926454-3520-132213323670876/AnsiballZ_copy.py'
Jan 26 09:57:38 compute-1 sudo[196800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:38 compute-1 python3.9[196802]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:38 compute-1 sudo[196800]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:38 compute-1 sudo[196803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:57:38 compute-1 sudo[196803]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:57:38 compute-1 sudo[196803]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:39 compute-1 sudo[196977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rthjlakuickrigjwvocwkndrxvdqplnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421458.8163602-3544-230765070869054/AnsiballZ_stat.py'
Jan 26 09:57:39 compute-1 sudo[196977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:39 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd1040028f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:39 compute-1 python3.9[196979]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:57:39 compute-1 sudo[196977]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:39 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd1040028f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:39 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:57:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:39.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:57:39 compute-1 sudo[197101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygfsnygfghdczkhojiolxtthghehpjlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421458.8163602-3544-230765070869054/AnsiballZ_copy.py'
Jan 26 09:57:39 compute-1 sudo[197101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:57:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:40.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:57:40 compute-1 python3.9[197103]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421458.8163602-3544-230765070869054/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:40 compute-1 sudo[197101]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:40 compute-1 ceph-mon[80107]: pgmap v455: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:57:40 compute-1 sudo[197253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdgzcfhlwvutcjhrtccpbjlaamdzaqto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421460.506974-3592-154355219898506/AnsiballZ_file.py'
Jan 26 09:57:40 compute-1 sudo[197253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:41 compute-1 python3.9[197255]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:41 compute-1 sudo[197253]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:41 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:41 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd1040028f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:41 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd1040028f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:41.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:57:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:42.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:57:42 compute-1 sudo[197406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhbdklquoqddnlkjqtkgsvicqrblftmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421461.7497523-3616-119774046958691/AnsiballZ_stat.py'
Jan 26 09:57:42 compute-1 sudo[197406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:42 compute-1 ceph-mon[80107]: pgmap v456: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:57:42 compute-1 python3.9[197408]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:57:42 compute-1 sudo[197406]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:57:42 compute-1 sudo[197484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxlejqgkvebafhwifzbuqiatbwcnkwmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421461.7497523-3616-119774046958691/AnsiballZ_file.py'
Jan 26 09:57:42 compute-1 sudo[197484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:42 compute-1 python3.9[197486]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:43 compute-1 sudo[197484]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd1040028f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:43 compute-1 sudo[197654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzufxgdvlauztuuotndfxpnbcqtqjtai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421463.2155595-3652-210264432653408/AnsiballZ_stat.py'
Jan 26 09:57:43 compute-1 sudo[197654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:43 compute-1 podman[197611]: 2026-01-26 09:57:43.654488443 +0000 UTC m=+0.104179637 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:57:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:43.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:43 compute-1 python3.9[197661]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:57:43 compute-1 sudo[197654]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:44.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:44 compute-1 sudo[197741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvphkrgztmvgemxyopdwyforlzozhcex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421463.2155595-3652-210264432653408/AnsiballZ_file.py'
Jan 26 09:57:44 compute-1 sudo[197741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:44 compute-1 ceph-mon[80107]: pgmap v457: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:57:44 compute-1 python3.9[197743]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.wjj7r4l8 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:44 compute-1 sudo[197741]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:45 compute-1 sudo[197893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imiezwohjeymyjerlhgddbwuaboyamhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421464.6398172-3688-139491250753807/AnsiballZ_stat.py'
Jan 26 09:57:45 compute-1 sudo[197893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:45 compute-1 python3.9[197895]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:57:45 compute-1 sudo[197893]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:45 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd1040028f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:45 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:45 compute-1 sudo[197972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfeozpbqiijkchngzhsxjugedijluvdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421464.6398172-3688-139491250753807/AnsiballZ_file.py'
Jan 26 09:57:45 compute-1 sudo[197972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:45 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:57:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:45.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:57:45 compute-1 python3.9[197974]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:45 compute-1 sudo[197972]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:57:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:46.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:57:46 compute-1 ceph-mon[80107]: pgmap v458: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:57:46 compute-1 sudo[198124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weobotbwenoxbxydziajiqbsyfymhbws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421466.1808002-3727-7130687072552/AnsiballZ_command.py'
Jan 26 09:57:46 compute-1 sudo[198124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:46 compute-1 python3.9[198126]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:57:46 compute-1 sudo[198124]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:47 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd1040028f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:47 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc003e70 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:47 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc003e70 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:47 compute-1 sudo[198278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gaxiwjpnmyyvfmogjitmefbsrrwzknln ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769421467.1717432-3751-218610750117794/AnsiballZ_edpm_nftables_from_files.py'
Jan 26 09:57:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:57:47 compute-1 sudo[198278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:47.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:47 compute-1 python3[198280]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 09:57:47 compute-1 sudo[198278]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:48.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:48 compute-1 ceph-mon[80107]: pgmap v459: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:57:48 compute-1 sudo[198430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfcjwjizqokynjwkcxuksjrtcrykernr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421468.1823118-3775-75166236948929/AnsiballZ_stat.py'
Jan 26 09:57:48 compute-1 sudo[198430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:48 compute-1 python3.9[198432]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:57:48 compute-1 sudo[198430]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:49 compute-1 sudo[198508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zddkzcmptxqlslsqsvngemrexctysccn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421468.1823118-3775-75166236948929/AnsiballZ_file.py'
Jan 26 09:57:49 compute-1 sudo[198508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:49 compute-1 python3.9[198510]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:49 compute-1 sudo[198508]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:49 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:57:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:49 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd1040028f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:49 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc003e70 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:57:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:57:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:49.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:57:50 compute-1 sudo[198661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqwoesiukgmwzbnokfscylltajsoxobe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421469.5502386-3811-173750776529031/AnsiballZ_stat.py'
Jan 26 09:57:50 compute-1 sudo[198661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:50.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:50 compute-1 python3.9[198663]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:57:50 compute-1 sudo[198661]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:50 compute-1 ceph-mon[80107]: pgmap v460: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:57:50 compute-1 sudo[198786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlyztdphjdewowjmpzkilywnhfgkcegs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421469.5502386-3811-173750776529031/AnsiballZ_copy.py'
Jan 26 09:57:50 compute-1 sudo[198786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:50 compute-1 python3.9[198788]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421469.5502386-3811-173750776529031/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:50 compute-1 sudo[198786]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:51 compute-1 kernel: ganesha.nfsd[189468]: segfault at 50 ip 00007fd192f2d32e sp 00007fd118ff8210 error 4 in libntirpc.so.5.8[7fd192f12000+2c000] likely on CPU 2 (core 0, socket 2)
Jan 26 09:57:51 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 09:57:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:51 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc003e70 fd 37 proxy ignored for local
Jan 26 09:57:51 compute-1 systemd[1]: Started Process Core Dump (PID 198911/UID 0).
Jan 26 09:57:51 compute-1 sudo[198940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afopgkhnagbhnacqeptsgvuihuxajfig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421471.1383688-3856-224670070249662/AnsiballZ_stat.py'
Jan 26 09:57:51 compute-1 sudo[198940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:51 compute-1 python3.9[198943]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:57:51 compute-1 sudo[198940]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:51.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:52 compute-1 sudo[199019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xakmigahzkntwruwxoxqhinztbpsyrla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421471.1383688-3856-224670070249662/AnsiballZ_file.py'
Jan 26 09:57:52 compute-1 sudo[199019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:52.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:52 compute-1 python3.9[199021]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:52 compute-1 sudo[199019]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:52 compute-1 ceph-mon[80107]: pgmap v461: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 09:57:52 compute-1 systemd-coredump[198913]: Process 187149 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 45:
                                                    #0  0x00007fd192f2d32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 26 09:57:52 compute-1 systemd[1]: systemd-coredump@5-198911-0.service: Deactivated successfully.
Jan 26 09:57:52 compute-1 systemd[1]: systemd-coredump@5-198911-0.service: Consumed 1.218s CPU time.
Jan 26 09:57:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:57:52 compute-1 podman[199050]: 2026-01-26 09:57:52.749455206 +0000 UTC m=+0.047676070 container died f91999dbe8ac46def68a1db73fc621f26c01213fbdcc9e21e2564db7aefa11b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Jan 26 09:57:52 compute-1 systemd[1]: var-lib-containers-storage-overlay-7db4ed4d191b8d4606088a1397a8878c3b2be7577ad0f488114cb7161674c534-merged.mount: Deactivated successfully.
Jan 26 09:57:52 compute-1 podman[199050]: 2026-01-26 09:57:52.799785506 +0000 UTC m=+0.098006310 container remove f91999dbe8ac46def68a1db73fc621f26c01213fbdcc9e21e2564db7aefa11b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid)
Jan 26 09:57:52 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 09:57:53 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 09:57:53 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.690s CPU time.
Jan 26 09:57:53 compute-1 sudo[199217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adwgjijnvidhgzfjtyjcnwqcvwdkhipv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421472.9247425-3892-155246134370517/AnsiballZ_stat.py'
Jan 26 09:57:53 compute-1 sudo[199217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:53 compute-1 python3.9[199219]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:57:53 compute-1 sudo[199217]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:57:53.919 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:57:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:57:53.920 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:57:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:57:53.920 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:57:53 compute-1 sudo[199296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeksmyqsrdrkdyiinuzevhursyefhsfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421472.9247425-3892-155246134370517/AnsiballZ_file.py'
Jan 26 09:57:53 compute-1 sudo[199296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:53.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:57:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:54.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:57:54 compute-1 python3.9[199298]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:54 compute-1 sudo[199296]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:54 compute-1 ceph-mon[80107]: pgmap v462: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:57:54 compute-1 sudo[199448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwsmqxypbkqasqcxnvjcrogvrwcjqtdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421474.5091093-3928-51725510179494/AnsiballZ_stat.py'
Jan 26 09:57:54 compute-1 sudo[199448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:55 compute-1 python3.9[199450]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:57:55 compute-1 sudo[199448]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:55 compute-1 sudo[199574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tykupmbiphhnoawrvcuiivhjanlucjrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421474.5091093-3928-51725510179494/AnsiballZ_copy.py'
Jan 26 09:57:55 compute-1 sudo[199574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:55 compute-1 python3.9[199576]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421474.5091093-3928-51725510179494/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:55 compute-1 sudo[199574]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:55.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:57:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:56.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:57:56 compute-1 ceph-mon[80107]: pgmap v463: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 09:57:56 compute-1 sudo[199726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spxoodeccindvejoqbuxzsmljbzwjmks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421476.3642154-3973-279532961489096/AnsiballZ_file.py'
Jan 26 09:57:56 compute-1 sudo[199726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:56 compute-1 python3.9[199728]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:56 compute-1 sudo[199726]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:57 compute-1 podman[199753]: 2026-01-26 09:57:57.312034395 +0000 UTC m=+0.084852354 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 26 09:57:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095757 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:57:57 compute-1 sudo[199900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qefjsrzfjjdckfeysstvpwvjgqpoiqot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421477.2912662-3997-175360572607939/AnsiballZ_command.py'
Jan 26 09:57:57 compute-1 sudo[199900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:57:57 compute-1 sshd-session[199824]: Invalid user ubuntu from 104.248.198.71 port 49890
Jan 26 09:57:57 compute-1 sshd-session[199824]: Connection closed by invalid user ubuntu 104.248.198.71 port 49890 [preauth]
Jan 26 09:57:57 compute-1 python3.9[199902]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:57:57 compute-1 sudo[199900]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:57.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:57:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:58.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:57:58 compute-1 ceph-mon[80107]: pgmap v464: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:57:58 compute-1 sudo[200063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhiydnngapojkxpdylxmyepaksltsjbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421478.18785-4022-55141170530445/AnsiballZ_blockinfile.py'
Jan 26 09:57:58 compute-1 sudo[200063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:58 compute-1 sudo[200050]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:57:58 compute-1 sudo[200050]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:57:58 compute-1 sudo[200050]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:58 compute-1 python3.9[200080]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:57:58 compute-1 sudo[200063]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:59 compute-1 sudo[200233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwsbfuakilqydejdoqqtrkthjrssukys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421479.3597915-4048-94942032660292/AnsiballZ_command.py'
Jan 26 09:57:59 compute-1 sudo[200233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:57:59 compute-1 python3.9[200235]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:57:59 compute-1 sudo[200233]: pam_unix(sudo:session): session closed for user root
Jan 26 09:57:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:57:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:57:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:59.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:58:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 09:58:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:00.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 09:58:00 compute-1 ceph-mon[80107]: pgmap v465: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:58:00 compute-1 sudo[200388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcpqibwtcngzolzdpepmaaooptiybrxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421480.209494-4072-256779640259507/AnsiballZ_stat.py'
Jan 26 09:58:00 compute-1 sudo[200388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:00 compute-1 python3.9[200390]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:58:00 compute-1 sudo[200388]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:01 compute-1 sshd-session[200336]: Connection closed by authenticating user root 152.42.136.110 port 32834 [preauth]
Jan 26 09:58:01 compute-1 sudo[200542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eifjbdeiehqumaxoabpmxjputvtqcfiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421481.0960827-4096-251943015913172/AnsiballZ_command.py'
Jan 26 09:58:01 compute-1 sudo[200542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:01 compute-1 python3.9[200545]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:58:01 compute-1 sudo[200542]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:01 compute-1 ceph-mon[80107]: pgmap v466: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 09:58:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:58:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:01.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:58:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:02.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:02 compute-1 sudo[200698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlktkciccddlncjxhutwsuojeorlelne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421482.036981-4120-37770102782474/AnsiballZ_file.py'
Jan 26 09:58:02 compute-1 sudo[200698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:02 compute-1 python3.9[200700]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:58:02 compute-1 sudo[200698]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:58:03 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 6.
Jan 26 09:58:03 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:58:03 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.690s CPU time.
Jan 26 09:58:03 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 09:58:03 compute-1 sudo[200864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acfltqejqkfuzuyjzgmrguirfhpopmom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421482.8390465-4144-196388026304540/AnsiballZ_stat.py'
Jan 26 09:58:03 compute-1 sudo[200864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:03 compute-1 podman[200902]: 2026-01-26 09:58:03.466108183 +0000 UTC m=+0.051168334 container create 47a793f86a4439f31eda22b535baa072091a02712074afe081570a9bf67e7cba (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 09:58:03 compute-1 python3.9[200873]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:58:03 compute-1 sudo[200864]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/517cd0d582ca3a4460bb9bc745e6e8235de97ea25035af73605c7c836f342957/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 09:58:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/517cd0d582ca3a4460bb9bc745e6e8235de97ea25035af73605c7c836f342957/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:58:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/517cd0d582ca3a4460bb9bc745e6e8235de97ea25035af73605c7c836f342957/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 09:58:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/517cd0d582ca3a4460bb9bc745e6e8235de97ea25035af73605c7c836f342957/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 09:58:03 compute-1 podman[200902]: 2026-01-26 09:58:03.44306077 +0000 UTC m=+0.028120911 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:58:03 compute-1 podman[200902]: 2026-01-26 09:58:03.555216121 +0000 UTC m=+0.140276252 container init 47a793f86a4439f31eda22b535baa072091a02712074afe081570a9bf67e7cba (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Jan 26 09:58:03 compute-1 podman[200902]: 2026-01-26 09:58:03.564471952 +0000 UTC m=+0.149532073 container start 47a793f86a4439f31eda22b535baa072091a02712074afe081570a9bf67e7cba (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 09:58:03 compute-1 bash[200902]: 47a793f86a4439f31eda22b535baa072091a02712074afe081570a9bf67e7cba
Jan 26 09:58:03 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:58:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:03 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 09:58:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:03 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 09:58:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:03.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:04 compute-1 sudo[201058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnxtsuvjfqozbfewuxihkadrsnrknkhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421482.8390465-4144-196388026304540/AnsiballZ_copy.py'
Jan 26 09:58:04 compute-1 sudo[201058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:04 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 09:58:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:04 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 09:58:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:04 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 09:58:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:04 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 09:58:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:04 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 09:58:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:04.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:04 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:58:04 compute-1 python3.9[201060]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421482.8390465-4144-196388026304540/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:58:04 compute-1 sudo[201058]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:04 compute-1 ceph-mon[80107]: pgmap v467: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:58:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:58:04 compute-1 sudo[201232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyfhzczymphjunvgxnwxduyaxtvyimye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421484.5003583-4189-57149880805252/AnsiballZ_stat.py'
Jan 26 09:58:04 compute-1 sudo[201232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:05 compute-1 python3.9[201234]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:58:05 compute-1 sudo[201232]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:05 compute-1 sudo[201356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxiykezdyglugmpoesbsnfokejnkgosl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421484.5003583-4189-57149880805252/AnsiballZ_copy.py'
Jan 26 09:58:05 compute-1 sudo[201356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:05 compute-1 python3.9[201358]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421484.5003583-4189-57149880805252/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:58:05 compute-1 sudo[201356]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:05.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:06 compute-1 ceph-mon[80107]: pgmap v468: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:58:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:06.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:06 compute-1 sudo[201508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzalbwfptmnjniqlgwbcrxfvrghjhxco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421486.0091622-4234-35548270332264/AnsiballZ_stat.py'
Jan 26 09:58:06 compute-1 sudo[201508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:06 compute-1 python3.9[201510]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:58:06 compute-1 sudo[201508]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:06 compute-1 sudo[201631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viulaojanzmxchgtumnfvjymdtlpihmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421486.0091622-4234-35548270332264/AnsiballZ_copy.py'
Jan 26 09:58:06 compute-1 sudo[201631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:07 compute-1 python3.9[201633]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421486.0091622-4234-35548270332264/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:58:07 compute-1 sudo[201631]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:58:07 compute-1 sudo[201784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xugpqhlwyrkvdthjdczkfgflqzfzyhrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421487.6098747-4279-93592581144445/AnsiballZ_systemd.py'
Jan 26 09:58:07 compute-1 sudo[201784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:08.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:08.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:08 compute-1 python3.9[201786]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:58:08 compute-1 systemd[1]: Reloading.
Jan 26 09:58:08 compute-1 systemd-rc-local-generator[201812]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:58:08 compute-1 systemd-sysv-generator[201818]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:58:08 compute-1 systemd[1]: Reached target edpm_libvirt.target.
Jan 26 09:58:08 compute-1 sudo[201784]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:08 compute-1 ceph-mon[80107]: pgmap v469: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:58:09 compute-1 sudo[201974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpbrbfzctucflkjjptmemrzefpwfpfzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421488.915298-4303-199481908978438/AnsiballZ_systemd.py'
Jan 26 09:58:09 compute-1 sudo[201974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:09 compute-1 python3.9[201976]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 26 09:58:09 compute-1 systemd[1]: Reloading.
Jan 26 09:58:09 compute-1 systemd-rc-local-generator[202005]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:58:09 compute-1 systemd-sysv-generator[202008]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:58:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 09:58:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:10.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 09:58:10 compute-1 systemd[1]: Reloading.
Jan 26 09:58:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:10.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:10 compute-1 systemd-sysv-generator[202044]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:58:10 compute-1 systemd-rc-local-generator[202041]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:58:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:10 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:58:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:10 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:58:10 compute-1 sudo[201974]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:10 compute-1 ceph-mon[80107]: pgmap v470: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:58:10 compute-1 sshd-session[143930]: Connection closed by 192.168.122.30 port 53548
Jan 26 09:58:10 compute-1 sshd-session[143927]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:58:10 compute-1 systemd[1]: session-52.scope: Deactivated successfully.
Jan 26 09:58:10 compute-1 systemd[1]: session-52.scope: Consumed 3min 56.093s CPU time.
Jan 26 09:58:10 compute-1 systemd-logind[783]: Session 52 logged out. Waiting for processes to exit.
Jan 26 09:58:10 compute-1 systemd-logind[783]: Removed session 52.
Jan 26 09:58:11 compute-1 sshd-session[202050]: Connection closed by authenticating user root 178.62.249.31 port 51306 [preauth]
Jan 26 09:58:11 compute-1 ceph-mon[80107]: pgmap v471: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 09:58:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:12.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:12.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:58:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:14.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 09:58:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:14.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 09:58:14 compute-1 ceph-mon[80107]: pgmap v472: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 09:58:14 compute-1 podman[202078]: 2026-01-26 09:58:14.35405376 +0000 UTC m=+0.122663052 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 26 09:58:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095815 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:58:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 09:58:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:16.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 09:58:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:16.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:16 compute-1 ceph-mon[80107]: pgmap v473: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 09:58:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 09:58:16 compute-1 sshd-session[202106]: Accepted publickey for zuul from 192.168.122.30 port 57050 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 09:58:16 compute-1 systemd-logind[783]: New session 53 of user zuul.
Jan 26 09:58:16 compute-1 systemd[1]: Started Session 53 of User zuul.
Jan 26 09:58:16 compute-1 sshd-session[202106]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:58:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:17 : epoch 69773aab : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f31c0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:17 : epoch 69773aab : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f31b8001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:58:17 compute-1 python3.9[202274]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:58:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:18 : epoch 69773aab : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f31a4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 09:58:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:18.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 09:58:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:18.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:18 compute-1 ceph-mon[80107]: pgmap v474: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 852 B/s wr, 2 op/s
Jan 26 09:58:18 compute-1 sudo[202379]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:58:18 compute-1 sudo[202379]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:58:18 compute-1 sudo[202379]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:19 compute-1 python3.9[202454]: ansible-ansible.builtin.service_facts Invoked
Jan 26 09:58:19 compute-1 network[202471]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 09:58:19 compute-1 network[202472]: 'network-scripts' will be removed from distribution in near future.
Jan 26 09:58:19 compute-1 network[202473]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 09:58:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:58:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095819 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:58:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:19 : epoch 69773aab : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f319c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:19 : epoch 69773aab : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f31c4001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:20 : epoch 69773aab : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f31b8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 09:58:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:20.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 09:58:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:20.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:20 compute-1 ceph-mon[80107]: pgmap v475: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 852 B/s wr, 2 op/s
Jan 26 09:58:21 compute-1 kernel: ganesha.nfsd[202111]: segfault at 50 ip 00007f324c40932e sp 00007f31d27fb210 error 4 in libntirpc.so.5.8[7f324c3ee000+2c000] likely on CPU 4 (core 0, socket 4)
Jan 26 09:58:21 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 09:58:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:21 : epoch 69773aab : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f31b8002520 fd 38 proxy ignored for local
Jan 26 09:58:21 compute-1 systemd[1]: Started Process Core Dump (PID 202534/UID 0).
Jan 26 09:58:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 09:58:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:22.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 09:58:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:22.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:22 compute-1 ceph-mon[80107]: pgmap v476: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 852 B/s wr, 2 op/s
Jan 26 09:58:22 compute-1 systemd-coredump[202535]: Process 200926 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 43:
                                                    #0  0x00007f324c40932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 26 09:58:22 compute-1 systemd[1]: systemd-coredump@6-202534-0.service: Deactivated successfully.
Jan 26 09:58:22 compute-1 systemd[1]: systemd-coredump@6-202534-0.service: Consumed 1.146s CPU time.
Jan 26 09:58:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:58:22 compute-1 podman[202552]: 2026-01-26 09:58:22.684971921 +0000 UTC m=+0.051553829 container died 47a793f86a4439f31eda22b535baa072091a02712074afe081570a9bf67e7cba (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 09:58:22 compute-1 systemd[1]: var-lib-containers-storage-overlay-517cd0d582ca3a4460bb9bc745e6e8235de97ea25035af73605c7c836f342957-merged.mount: Deactivated successfully.
Jan 26 09:58:22 compute-1 podman[202552]: 2026-01-26 09:58:22.731099559 +0000 UTC m=+0.097681407 container remove 47a793f86a4439f31eda22b535baa072091a02712074afe081570a9bf67e7cba (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 09:58:22 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 09:58:22 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 09:58:22 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.697s CPU time.
Jan 26 09:58:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 09:58:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:24.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 09:58:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 09:58:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:24.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 09:58:24 compute-1 ceph-mon[80107]: pgmap v477: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Jan 26 09:58:24 compute-1 sudo[202796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqehmezwyiakcnojyxkqvcbkkcvxhftv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421504.602911-97-39962738484221/AnsiballZ_setup.py'
Jan 26 09:58:24 compute-1 sudo[202796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:25 compute-1 python3.9[202798]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 09:58:25 compute-1 sudo[202796]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:26.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:26 compute-1 sudo[202881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovhyeqppzobftoabihvlpzhaitnzbtqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421504.602911-97-39962738484221/AnsiballZ_dnf.py'
Jan 26 09:58:26 compute-1 sudo[202881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:26.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:26 compute-1 ceph-mon[80107]: pgmap v478: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:58:26 compute-1 python3.9[202883]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:58:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095827 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:58:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:58:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 09:58:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:28.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 09:58:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:28.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:28 compute-1 podman[202886]: 2026-01-26 09:58:28.312899848 +0000 UTC m=+0.095745730 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 26 09:58:28 compute-1 ceph-mon[80107]: pgmap v479: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:58:28 compute-1 sudo[202906]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:58:28 compute-1 sudo[202906]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:58:28 compute-1 sudo[202906]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:28 compute-1 sudo[202931]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 09:58:28 compute-1 sudo[202931]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:58:29 compute-1 sudo[202931]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:30.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:30.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:30 compute-1 ceph-mon[80107]: pgmap v480: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 26 09:58:30 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:58:30 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 09:58:30 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:58:30 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:58:30 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 09:58:30 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 09:58:30 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:58:31 compute-1 sudo[202881]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 09:58:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:32.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 09:58:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:32.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:32 compute-1 ceph-mon[80107]: pgmap v481: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:58:32 compute-1 sudo[203138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxjjufkjqictmfpcgltxyqaquhbsasie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421512.1173365-133-75715281890044/AnsiballZ_stat.py'
Jan 26 09:58:32 compute-1 sudo[203138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:58:32 compute-1 python3.9[203140]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:58:32 compute-1 sudo[203138]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:33 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 7.
Jan 26 09:58:33 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:58:33 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.697s CPU time.
Jan 26 09:58:33 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 09:58:33 compute-1 podman[203289]: 2026-01-26 09:58:33.559859285 +0000 UTC m=+0.058404942 container create cbf27e002fa6e7d390d75e124dda783fcdb8c722ca7000472b152c8413a63b8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Jan 26 09:58:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18b268ab60ff9aff8b27d65bea4ca79576ef17e6e07f8e0cca83a86a786d46ca/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 09:58:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18b268ab60ff9aff8b27d65bea4ca79576ef17e6e07f8e0cca83a86a786d46ca/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 09:58:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18b268ab60ff9aff8b27d65bea4ca79576ef17e6e07f8e0cca83a86a786d46ca/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 09:58:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18b268ab60ff9aff8b27d65bea4ca79576ef17e6e07f8e0cca83a86a786d46ca/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 09:58:33 compute-1 sudo[203349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hskfghqoxsswutlyohxaplimtwxznxze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421513.1343148-163-214908882206793/AnsiballZ_command.py'
Jan 26 09:58:33 compute-1 sudo[203349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:33 compute-1 podman[203289]: 2026-01-26 09:58:33.540796551 +0000 UTC m=+0.039342228 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 09:58:33 compute-1 podman[203289]: 2026-01-26 09:58:33.634956333 +0000 UTC m=+0.133502040 container init cbf27e002fa6e7d390d75e124dda783fcdb8c722ca7000472b152c8413a63b8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 09:58:33 compute-1 podman[203289]: 2026-01-26 09:58:33.641207532 +0000 UTC m=+0.139753199 container start cbf27e002fa6e7d390d75e124dda783fcdb8c722ca7000472b152c8413a63b8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Jan 26 09:58:33 compute-1 bash[203289]: cbf27e002fa6e7d390d75e124dda783fcdb8c722ca7000472b152c8413a63b8a
Jan 26 09:58:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 09:58:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 09:58:33 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 09:58:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 09:58:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 09:58:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 09:58:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 09:58:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 09:58:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:58:33 compute-1 python3.9[203356]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:58:33 compute-1 sudo[203349]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:34 compute-1 sudo[203421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 09:58:34 compute-1 sudo[203421]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:58:34 compute-1 sudo[203421]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 09:58:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:34.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 09:58:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 09:58:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:34.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 09:58:34 compute-1 ceph-mon[80107]: pgmap v482: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 425 B/s wr, 1 op/s
Jan 26 09:58:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:58:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:58:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:58:34 compute-1 sudo[203571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbahxxxtjbhlvbrhvevsbzzcopdicqrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421514.2250412-193-246882373692717/AnsiballZ_stat.py'
Jan 26 09:58:34 compute-1 sudo[203571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:34 compute-1 python3.9[203573]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:58:34 compute-1 sudo[203571]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:35 compute-1 sudo[203723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrxpyprmcdhkvjgwvwvjfzrtqrhtghld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421514.9948697-217-167142349735787/AnsiballZ_command.py'
Jan 26 09:58:35 compute-1 sudo[203723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:35 compute-1 python3.9[203725]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:58:35 compute-1 sudo[203723]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:36.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:36 compute-1 sudo[203877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doalbwsfakpmbdeuuknflhrndmwpeytl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421515.7588308-241-59159069202668/AnsiballZ_stat.py'
Jan 26 09:58:36 compute-1 sudo[203877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 09:58:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:36.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 09:58:36 compute-1 python3.9[203879]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:58:36 compute-1 sudo[203877]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:36 compute-1 ceph-mon[80107]: pgmap v483: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 765 B/s wr, 3 op/s
Jan 26 09:58:36 compute-1 sudo[204000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhzxhdfauvlcoskiwfpyiwayeijwtshv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421515.7588308-241-59159069202668/AnsiballZ_copy.py'
Jan 26 09:58:36 compute-1 sudo[204000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:36 compute-1 python3.9[204002]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421515.7588308-241-59159069202668/.source.iscsi _original_basename=.pj379ukk follow=False checksum=e8ca1be981ffe6ba52507fee4e91187479634914 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:58:37 compute-1 sudo[204000]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:37 compute-1 ceph-mon[80107]: pgmap v484: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 680 B/s wr, 2 op/s
Jan 26 09:58:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:58:37 compute-1 sudo[204153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdlwrxpxshyanntgwfgttwzuhkocnhgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421517.271745-286-65979102916146/AnsiballZ_file.py'
Jan 26 09:58:37 compute-1 sudo[204153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:38.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:38 compute-1 python3.9[204155]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:58:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:38.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:38 compute-1 sudo[204153]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:38 compute-1 sudo[204305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyfrveytkjuomzoptspkgdhoacoaymfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421518.3579557-310-251782141458360/AnsiballZ_lineinfile.py'
Jan 26 09:58:38 compute-1 sudo[204305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:38 compute-1 sudo[204308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:58:38 compute-1 sudo[204308]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:58:38 compute-1 sudo[204308]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:38 compute-1 python3.9[204307]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:58:39 compute-1 sudo[204305]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 26 09:58:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 26 09:58:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:58:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:58:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 09:58:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:58:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:58:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:58:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 09:58:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:40 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:58:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:40 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:58:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:40 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:58:40 compute-1 ceph-mon[80107]: pgmap v485: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 680 B/s wr, 2 op/s
Jan 26 09:58:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 09:58:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:40.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 09:58:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 09:58:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:40.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 09:58:40 compute-1 sudo[204483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufdsgtayqvjzdgldmhrnejrkbeegdufq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421519.4627979-337-216070433939262/AnsiballZ_systemd_service.py'
Jan 26 09:58:40 compute-1 sudo[204483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:40 compute-1 python3.9[204485]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:58:40 compute-1 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 26 09:58:40 compute-1 sudo[204483]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:41 compute-1 sudo[204639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfaxadfejfodwhrtperldetmsvstlpwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421520.836152-361-246755722944874/AnsiballZ_systemd_service.py'
Jan 26 09:58:41 compute-1 sudo[204639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:41 compute-1 python3.9[204641]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:58:41 compute-1 systemd[1]: Reloading.
Jan 26 09:58:41 compute-1 systemd-rc-local-generator[204672]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:58:41 compute-1 systemd-sysv-generator[204676]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:58:41 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 26 09:58:41 compute-1 systemd[1]: Starting Open-iSCSI...
Jan 26 09:58:41 compute-1 kernel: Loading iSCSI transport class v2.0-870.
Jan 26 09:58:41 compute-1 systemd[1]: Started Open-iSCSI.
Jan 26 09:58:41 compute-1 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 26 09:58:41 compute-1 sudo[204639]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:41 compute-1 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 26 09:58:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 09:58:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:42.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 09:58:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:42.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:58:42 compute-1 ceph-mon[80107]: pgmap v486: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 936 B/s wr, 3 op/s
Jan 26 09:58:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095843 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:58:43 compute-1 python3.9[204840]: ansible-ansible.builtin.service_facts Invoked
Jan 26 09:58:43 compute-1 network[204857]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 09:58:43 compute-1 network[204858]: 'network-scripts' will be removed from distribution in near future.
Jan 26 09:58:43 compute-1 network[204859]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 09:58:43 compute-1 ceph-mon[80107]: pgmap v487: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 595 B/s wr, 2 op/s
Jan 26 09:58:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:44.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:44.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:44 compute-1 podman[204875]: 2026-01-26 09:58:44.549537013 +0000 UTC m=+0.136528262 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 09:58:44 compute-1 sshd-session[204866]: Connection closed by authenticating user root 152.42.136.110 port 45450 [preauth]
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-000000000000001a:nfs.cephfs.0: -2
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 09:58:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 09:58:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 09:58:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:46.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 09:58:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:46.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:46 compute-1 ceph-mon[80107]: pgmap v488: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.5 KiB/s rd, 1.3 KiB/s wr, 5 op/s
Jan 26 09:58:46 compute-1 sshd-session[204956]: Invalid user ubuntu from 104.248.198.71 port 42794
Jan 26 09:58:46 compute-1 sshd-session[204956]: Connection closed by invalid user ubuntu 104.248.198.71 port 42794 [preauth]
Jan 26 09:58:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:47 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b44000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:47 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:58:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:48 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:48.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:48.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:48 compute-1 ceph-mon[80107]: pgmap v489: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:58:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:58:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095849 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:58:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:49 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:49 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b30001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:50 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:50.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:50.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:50 compute-1 ceph-mon[80107]: pgmap v490: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:58:50 compute-1 sudo[205177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfutdrpqdwmdhobvtkqtzifgvtkjzmnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421530.5149996-430-116857080869137/AnsiballZ_dnf.py'
Jan 26 09:58:50 compute-1 sudo[205177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:51 compute-1 python3.9[205179]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:58:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:51 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:51 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b180016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:52 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b30001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 09:58:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:52.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 09:58:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:52.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:52 compute-1 ceph-mon[80107]: pgmap v491: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:58:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:58:52 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Jan 26 09:58:52 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:52.964670) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 09:58:52 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Jan 26 09:58:52 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421532964730, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1603, "num_deletes": 255, "total_data_size": 4107087, "memory_usage": 4167248, "flush_reason": "Manual Compaction"}
Jan 26 09:58:52 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Jan 26 09:58:52 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421532984900, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 2675326, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18200, "largest_seqno": 19798, "table_properties": {"data_size": 2668688, "index_size": 3773, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 13136, "raw_average_key_size": 18, "raw_value_size": 2655560, "raw_average_value_size": 3804, "num_data_blocks": 169, "num_entries": 698, "num_filter_entries": 698, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769421388, "oldest_key_time": 1769421388, "file_creation_time": 1769421532, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 26 09:58:52 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 20297 microseconds, and 6359 cpu microseconds.
Jan 26 09:58:52 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 09:58:52 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:52.984964) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 2675326 bytes OK
Jan 26 09:58:52 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:52.984992) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Jan 26 09:58:52 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:52.986543) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Jan 26 09:58:52 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:52.986560) EVENT_LOG_v1 {"time_micros": 1769421532986555, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 09:58:52 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:52.986632) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 09:58:52 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 4099775, prev total WAL file size 4099775, number of live WAL files 2.
Jan 26 09:58:52 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:58:52 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:52.987882) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323532' seq:0, type:0; will stop at (end)
Jan 26 09:58:52 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 09:58:52 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(2612KB)], [33(11MB)]
Jan 26 09:58:52 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421532987953, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 14277469, "oldest_snapshot_seqno": -1}
Jan 26 09:58:53 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 5002 keys, 13831980 bytes, temperature: kUnknown
Jan 26 09:58:53 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421533054339, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 13831980, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13796599, "index_size": 21767, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12549, "raw_key_size": 126827, "raw_average_key_size": 25, "raw_value_size": 13704085, "raw_average_value_size": 2739, "num_data_blocks": 897, "num_entries": 5002, "num_filter_entries": 5002, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769421532, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 26 09:58:53 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 09:58:53 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:53.055290) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 13831980 bytes
Jan 26 09:58:53 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:53.081875) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 213.1 rd, 206.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 11.1 +0.0 blob) out(13.2 +0.0 blob), read-write-amplify(10.5) write-amplify(5.2) OK, records in: 5526, records dropped: 524 output_compression: NoCompression
Jan 26 09:58:53 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:53.081920) EVENT_LOG_v1 {"time_micros": 1769421533081904, "job": 18, "event": "compaction_finished", "compaction_time_micros": 66997, "compaction_time_cpu_micros": 31288, "output_level": 6, "num_output_files": 1, "total_output_size": 13831980, "num_input_records": 5526, "num_output_records": 5002, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 09:58:53 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:58:53 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421533082566, "job": 18, "event": "table_file_deletion", "file_number": 35}
Jan 26 09:58:53 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 09:58:53 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421533084743, "job": 18, "event": "table_file_deletion", "file_number": 33}
Jan 26 09:58:53 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:52.987774) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:58:53 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:53.084802) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:58:53 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:53.084807) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:58:53 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:53.084809) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:58:53 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:53.084811) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:58:53 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:53.084813) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 09:58:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:53 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:53 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 09:58:53 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 09:58:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:53 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:53 compute-1 systemd[1]: Reloading.
Jan 26 09:58:53 compute-1 systemd-rc-local-generator[205225]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:58:53 compute-1 systemd-sysv-generator[205229]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:58:53 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 09:58:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:58:53.920 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:58:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:58:53.920 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:58:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:58:53.920 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:58:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:54 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b180016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:54.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:54 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 09:58:54 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 09:58:54 compute-1 systemd[1]: run-r61878c67b40549a48e0a13bfa14a249d.service: Deactivated successfully.
Jan 26 09:58:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:54.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:54 compute-1 sudo[205177]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:54 compute-1 ceph-mon[80107]: pgmap v492: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Jan 26 09:58:55 compute-1 sudo[205495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmkrueqrqgrokjsgeirzlwpgcctpzzdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421534.7275088-457-215748723127638/AnsiballZ_file.py'
Jan 26 09:58:55 compute-1 sudo[205495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:55 compute-1 python3.9[205497]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 26 09:58:55 compute-1 sudo[205495]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:55 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b30001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:55 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:55 compute-1 ceph-mon[80107]: pgmap v493: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Jan 26 09:58:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:56 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:56 compute-1 sudo[205649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkdnmcvjnmlqfrqlycjxqvgtaedytbvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421535.594724-481-273071479299282/AnsiballZ_modprobe.py'
Jan 26 09:58:56 compute-1 sudo[205649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:56.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:56.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:56 compute-1 python3.9[205651]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 26 09:58:56 compute-1 sudo[205649]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:56 compute-1 sudo[205806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnfmllsthvcukqbjxfzccqvujpaiukve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421536.6091857-505-153600535190746/AnsiballZ_stat.py'
Jan 26 09:58:56 compute-1 sudo[205806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:57 compute-1 python3.9[205808]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:58:57 compute-1 sudo[205806]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:57 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b180016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:57 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b30001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:58:57 compute-1 sudo[205930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajixvgglmmvwiokzkwecdiyptbhsjcza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421536.6091857-505-153600535190746/AnsiballZ_copy.py'
Jan 26 09:58:57 compute-1 sudo[205930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:57 compute-1 sshd-session[205645]: Connection closed by authenticating user root 178.62.249.31 port 44114 [preauth]
Jan 26 09:58:57 compute-1 python3.9[205932]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421536.6091857-505-153600535190746/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:58:57 compute-1 sudo[205930]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:58 compute-1 ceph-mon[80107]: pgmap v494: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:58:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:58 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.003000070s ======
Jan 26 09:58:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:58.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000070s
Jan 26 09:58:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:58:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:58:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:58.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:58:58 compute-1 sudo[206091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nygpwkgguohuqciqicttyuolcbmhyabu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421538.2764435-553-175788213169371/AnsiballZ_lineinfile.py'
Jan 26 09:58:58 compute-1 sudo[206091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:58 compute-1 podman[206056]: 2026-01-26 09:58:58.671581018 +0000 UTC m=+0.085916966 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 09:58:58 compute-1 python3.9[206098]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:58:58 compute-1 sudo[206091]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:59 compute-1 sudo[206126]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:58:59 compute-1 sudo[206126]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:58:59 compute-1 sudo[206126]: pam_unix(sudo:session): session closed for user root
Jan 26 09:58:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:59 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:59 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:58:59 compute-1 sudo[206277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yognljovnuzgnvcmcqehvgazyeuzknkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421539.0758886-577-6462933483362/AnsiballZ_systemd.py'
Jan 26 09:58:59 compute-1 sudo[206277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:58:59 compute-1 python3.9[206279]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 09:59:00 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 26 09:59:00 compute-1 systemd[1]: Stopped Load Kernel Modules.
Jan 26 09:59:00 compute-1 systemd[1]: Stopping Load Kernel Modules...
Jan 26 09:59:00 compute-1 systemd[1]: Starting Load Kernel Modules...
Jan 26 09:59:00 compute-1 systemd[1]: Finished Load Kernel Modules.
Jan 26 09:59:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:00 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b300031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:00 compute-1 sudo[206277]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:00.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:00.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:00 compute-1 ceph-mon[80107]: pgmap v495: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:59:00 compute-1 sudo[206433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcfdnoklieqzzrrxdvsjkfypoysasmwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421540.431973-601-275380636020145/AnsiballZ_command.py'
Jan 26 09:59:00 compute-1 sudo[206433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:00 compute-1 python3.9[206435]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:59:01 compute-1 sudo[206433]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:01 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:01 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:01 compute-1 anacron[4199]: Job `cron.monthly' started
Jan 26 09:59:01 compute-1 anacron[4199]: Job `cron.monthly' terminated
Jan 26 09:59:01 compute-1 anacron[4199]: Normal exit (3 jobs run)
Jan 26 09:59:02 compute-1 sudo[206589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khtcwclrhifkwlomthuhdgolvzhzuzkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421541.6859267-631-151956356107592/AnsiballZ_stat.py'
Jan 26 09:59:02 compute-1 sudo[206589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:02 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 09:59:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:02.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 09:59:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 09:59:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:02.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 09:59:02 compute-1 python3.9[206591]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:59:02 compute-1 ceph-mon[80107]: pgmap v496: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:59:02 compute-1 sudo[206589]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:59:02 compute-1 sudo[206741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdgtssjsbnokcmbcywhsydrzrbvtoywn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421542.606024-658-96448631358218/AnsiballZ_stat.py'
Jan 26 09:59:02 compute-1 sudo[206741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:03 compute-1 python3.9[206743]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:59:03 compute-1 sudo[206741]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:03 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b300031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:03 compute-1 sudo[206864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfnerymqykqomaekyxbwszetqyukeclw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421542.606024-658-96448631358218/AnsiballZ_copy.py'
Jan 26 09:59:03 compute-1 sudo[206864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:03 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:03 compute-1 python3.9[206866]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421542.606024-658-96448631358218/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:03 compute-1 sudo[206864]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:04 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 09:59:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:04.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 09:59:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:04.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:04 compute-1 sudo[207017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnoprcasfxhttzrpeqvwvnwyolwbtkri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421544.1861768-703-103187133060753/AnsiballZ_command.py'
Jan 26 09:59:04 compute-1 sudo[207017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:04 compute-1 ceph-mon[80107]: pgmap v497: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:59:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:59:04 compute-1 python3.9[207019]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:59:04 compute-1 sudo[207017]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:05 compute-1 sudo[207170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-parqwegvmyurlfhputyefopxvfcqexko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421545.0423434-727-150455004706369/AnsiballZ_lineinfile.py'
Jan 26 09:59:05 compute-1 sudo[207170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:05 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:05 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b300031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:05 compute-1 python3.9[207172]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:05 compute-1 sudo[207170]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:05 compute-1 ceph-mon[80107]: pgmap v498: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:59:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:06 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:06.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:06.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:06 compute-1 sudo[207323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfelztrvugkgwisfbopeghedvkbflsuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421545.8173258-751-82106513969814/AnsiballZ_replace.py'
Jan 26 09:59:06 compute-1 sudo[207323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:06 compute-1 python3.9[207325]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:06 compute-1 sudo[207323]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:07 compute-1 sudo[207475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvxxqzpmnhxahooxneenakkfgswxwsof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421546.9288025-775-50167464399126/AnsiballZ_replace.py'
Jan 26 09:59:07 compute-1 sudo[207475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:07 compute-1 python3.9[207477]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:07 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:07 compute-1 sudo[207475]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:07 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:59:08 compute-1 sudo[207628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jomlolpuncnegirjgwgtnfqaclnfdzam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421547.8198798-802-87349966771062/AnsiballZ_lineinfile.py'
Jan 26 09:59:08 compute-1 sudo[207628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:08 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:08 compute-1 ceph-mon[80107]: pgmap v499: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:59:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:08.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 09:59:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:08.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 09:59:08 compute-1 python3.9[207630]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:08 compute-1 sudo[207628]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:08 compute-1 sudo[207780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wteoqqpgozdyencpvtlzdjmzxmyutgwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421548.4415536-802-177936801273641/AnsiballZ_lineinfile.py'
Jan 26 09:59:08 compute-1 sudo[207780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:08 compute-1 python3.9[207782]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:08 compute-1 sudo[207780]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:09 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:09 compute-1 sudo[207932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltzleqtnveyhzlzjahkyjfdnungdhdew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421549.1566796-802-275837787119771/AnsiballZ_lineinfile.py'
Jan 26 09:59:09 compute-1 sudo[207932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:09 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:09 compute-1 python3.9[207935]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:09 compute-1 sudo[207932]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:10 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:10.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 09:59:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:10.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 09:59:10 compute-1 sudo[208085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcpzzhgocalivvrnjshtpsajkrkxwado ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421549.8764746-802-38919688433964/AnsiballZ_lineinfile.py'
Jan 26 09:59:10 compute-1 sudo[208085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:10 compute-1 python3.9[208087]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:10 compute-1 sudo[208085]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:10 compute-1 ceph-mon[80107]: pgmap v500: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:59:11 compute-1 sudo[208237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otkmqbtbjgcreajsryydkgzwgwvhoxpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421550.8447618-889-155136875462926/AnsiballZ_stat.py'
Jan 26 09:59:11 compute-1 sudo[208237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:11 compute-1 python3.9[208239]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 09:59:11 compute-1 sudo[208237]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:11 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:11 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:12 compute-1 sudo[208392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-laadsmsuogxmdqukjsrlidqmjvibakrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421551.7414012-913-232424758411713/AnsiballZ_command.py'
Jan 26 09:59:12 compute-1 sudo[208392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:12 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:12.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:12.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:12 compute-1 python3.9[208394]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 09:59:12 compute-1 sudo[208392]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:12 compute-1 ceph-mon[80107]: pgmap v501: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 26 09:59:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:59:12 compute-1 sudo[208545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ineslqoilkpjhgmvgsqsxbrixhqeduss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421552.6410918-940-56388259325359/AnsiballZ_systemd_service.py'
Jan 26 09:59:12 compute-1 sudo[208545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:13 compute-1 python3.9[208547]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:59:13 compute-1 systemd[1]: Listening on multipathd control socket.
Jan 26 09:59:13 compute-1 sudo[208545]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:13 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:13 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:14 compute-1 sudo[208702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlficndlihlozxnufjblokcfewcygslw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421553.6841872-964-208922841031868/AnsiballZ_systemd_service.py'
Jan 26 09:59:14 compute-1 sudo[208702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:14 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:14.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:14.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:14 compute-1 ceph-mon[80107]: pgmap v502: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:59:14 compute-1 python3.9[208704]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:59:14 compute-1 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 26 09:59:14 compute-1 udevadm[208709]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 26 09:59:14 compute-1 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 26 09:59:14 compute-1 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 26 09:59:14 compute-1 multipathd[208712]: --------start up--------
Jan 26 09:59:14 compute-1 multipathd[208712]: read /etc/multipath.conf
Jan 26 09:59:14 compute-1 multipathd[208712]: path checkers start up
Jan 26 09:59:14 compute-1 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 26 09:59:14 compute-1 sudo[208702]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:14 compute-1 podman[208720]: 2026-01-26 09:59:14.804629225 +0000 UTC m=+0.109464713 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 09:59:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:15 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:15 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:15 compute-1 sudo[208897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nveirchorjfiocmvxyuhwnixxvgnopcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421555.469052-1000-398082216933/AnsiballZ_file.py'
Jan 26 09:59:15 compute-1 sudo[208897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:16 compute-1 python3.9[208899]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 26 09:59:16 compute-1 sudo[208897]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:16 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000059s ======
Jan 26 09:59:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:16.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000059s
Jan 26 09:59:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:16.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:16 compute-1 ceph-mon[80107]: pgmap v503: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:59:16 compute-1 sudo[209049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkaasbaupugyyzflvggjavurxudiiiai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421556.2285302-1024-231349526130932/AnsiballZ_modprobe.py'
Jan 26 09:59:16 compute-1 sudo[209049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:16 compute-1 python3.9[209051]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 26 09:59:16 compute-1 kernel: Key type psk registered
Jan 26 09:59:16 compute-1 sudo[209049]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:17 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:17 compute-1 sudo[209212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqkgvncpcwaoflivpzvzqqwjmdzhrkjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421557.13052-1048-248105863941937/AnsiballZ_stat.py'
Jan 26 09:59:17 compute-1 sudo[209212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:17 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:17 compute-1 python3.9[209214]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 09:59:17 compute-1 sudo[209212]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:59:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:18 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:18 compute-1 sudo[209336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxsmqbmjaodvmreplwllrvtecyezpwof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421557.13052-1048-248105863941937/AnsiballZ_copy.py'
Jan 26 09:59:18 compute-1 sudo[209336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000059s ======
Jan 26 09:59:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:18.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000059s
Jan 26 09:59:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:18.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:18 compute-1 python3.9[209338]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421557.13052-1048-248105863941937/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:18 compute-1 sudo[209336]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:19 compute-1 sudo[209462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:59:19 compute-1 sudo[209462]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:59:19 compute-1 sudo[209462]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:19 compute-1 sudo[209513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otyoswdpcfojdjwxelqnpijigrerzqug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421558.8279557-1096-10850583544459/AnsiballZ_lineinfile.py'
Jan 26 09:59:19 compute-1 sudo[209513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:19 compute-1 ceph-mon[80107]: pgmap v504: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:59:19 compute-1 python3.9[209515]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:19 compute-1 sudo[209513]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:19 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:19 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:20 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:20 compute-1 sudo[209667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqcwycwhkawuqjirnklqlwskolxauldb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421559.848893-1122-7175104610935/AnsiballZ_systemd.py'
Jan 26 09:59:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:20.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:20 compute-1 sudo[209667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000059s ======
Jan 26 09:59:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:20.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000059s
Jan 26 09:59:20 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:59:20 compute-1 ceph-mon[80107]: pgmap v505: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:59:20 compute-1 python3.9[209669]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 09:59:20 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 26 09:59:20 compute-1 systemd[1]: Stopped Load Kernel Modules.
Jan 26 09:59:20 compute-1 systemd[1]: Stopping Load Kernel Modules...
Jan 26 09:59:20 compute-1 systemd[1]: Starting Load Kernel Modules...
Jan 26 09:59:20 compute-1 systemd[1]: Finished Load Kernel Modules.
Jan 26 09:59:20 compute-1 sudo[209667]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:21 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:21 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:22 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:22 compute-1 sudo[209824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cowreanaagahukzzgcpfpuixoismhhzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421561.7117672-1144-124601391295581/AnsiballZ_dnf.py'
Jan 26 09:59:22 compute-1 sudo[209824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:22.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:22 compute-1 ceph-mon[80107]: pgmap v506: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 26 09:59:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:22.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:22 compute-1 python3.9[209826]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 09:59:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:59:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:23 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:23 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:24 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:24.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:24.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:24 compute-1 ceph-mon[80107]: pgmap v507: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:59:24 compute-1 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 26 09:59:25 compute-1 systemd[1]: Reloading.
Jan 26 09:59:25 compute-1 systemd-rc-local-generator[209859]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:59:25 compute-1 systemd-sysv-generator[209863]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:59:25 compute-1 systemd[1]: Reloading.
Jan 26 09:59:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095925 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 09:59:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:25 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:25 compute-1 systemd-rc-local-generator[209893]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:59:25 compute-1 systemd-sysv-generator[209898]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:59:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:25 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:25 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 26 09:59:25 compute-1 systemd-logind[783]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 26 09:59:25 compute-1 systemd-logind[783]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 26 09:59:25 compute-1 lvm[209947]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 09:59:25 compute-1 lvm[209947]: VG ceph_vg0 finished
Jan 26 09:59:26 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 09:59:26 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 09:59:26 compute-1 systemd[1]: Reloading.
Jan 26 09:59:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:26 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:26 compute-1 systemd-rc-local-generator[209999]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:59:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000058s ======
Jan 26 09:59:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:26.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000058s
Jan 26 09:59:26 compute-1 systemd-sysv-generator[210002]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:59:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:26.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:26 compute-1 ceph-mon[80107]: pgmap v508: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:59:26 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 09:59:26 compute-1 sudo[209824]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:26 compute-1 sshd-session[209909]: Connection closed by authenticating user root 152.42.136.110 port 60892 [preauth]
Jan 26 09:59:27 compute-1 sudo[211161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqmlchrveekhrhakprlzlgkxebkaxpdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421567.0676205-1168-235751754361665/AnsiballZ_systemd_service.py'
Jan 26 09:59:27 compute-1 sudo[211161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:27 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:27 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:27 compute-1 python3.9[211187]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 09:59:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:59:27 compute-1 systemd[1]: Stopping Open-iSCSI...
Jan 26 09:59:27 compute-1 iscsid[204682]: iscsid shutting down.
Jan 26 09:59:27 compute-1 systemd[1]: iscsid.service: Deactivated successfully.
Jan 26 09:59:27 compute-1 systemd[1]: Stopped Open-iSCSI.
Jan 26 09:59:27 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 26 09:59:27 compute-1 systemd[1]: Starting Open-iSCSI...
Jan 26 09:59:27 compute-1 systemd[1]: Started Open-iSCSI.
Jan 26 09:59:27 compute-1 sudo[211161]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:28 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:28 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 09:59:28 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 09:59:28 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.886s CPU time.
Jan 26 09:59:28 compute-1 systemd[1]: run-ra0912b71835648ffa13a1280f2905ef2.service: Deactivated successfully.
Jan 26 09:59:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:28.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:28.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:28 compute-1 ceph-mon[80107]: pgmap v509: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:59:28 compute-1 sudo[211455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqebsgdnmqxtlwuonrwuxavxatedjqzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421568.189316-1192-42202929324241/AnsiballZ_systemd_service.py'
Jan 26 09:59:28 compute-1 sudo[211455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:28 compute-1 python3.9[211457]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 09:59:28 compute-1 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 26 09:59:28 compute-1 multipathd[208712]: exit (signal)
Jan 26 09:59:28 compute-1 multipathd[208712]: --------shut down-------
Jan 26 09:59:28 compute-1 systemd[1]: multipathd.service: Deactivated successfully.
Jan 26 09:59:28 compute-1 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 26 09:59:28 compute-1 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 26 09:59:28 compute-1 podman[211459]: 2026-01-26 09:59:28.970719313 +0000 UTC m=+0.074856301 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 09:59:29 compute-1 multipathd[211483]: --------start up--------
Jan 26 09:59:29 compute-1 multipathd[211483]: read /etc/multipath.conf
Jan 26 09:59:29 compute-1 multipathd[211483]: path checkers start up
Jan 26 09:59:29 compute-1 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 26 09:59:29 compute-1 sudo[211455]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:29 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:29 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:30 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:30 compute-1 python3.9[211641]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 09:59:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:30.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000058s ======
Jan 26 09:59:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:30.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000058s
Jan 26 09:59:30 compute-1 ceph-mon[80107]: pgmap v510: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:59:31 compute-1 sudo[211795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uplsbqcbaumraghukiwvxtwpszzklnje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421570.7915258-1244-71157509756199/AnsiballZ_file.py'
Jan 26 09:59:31 compute-1 sudo[211795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:31 compute-1 python3.9[211797]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:31 compute-1 sudo[211795]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:31 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:31 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:32 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000058s ======
Jan 26 09:59:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:32.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000058s
Jan 26 09:59:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:32.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:32 compute-1 sudo[211948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwqcgyzrztrgjiaspvsutusbzhgjxvdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421571.9345484-1277-177166080674725/AnsiballZ_systemd_service.py'
Jan 26 09:59:32 compute-1 sudo[211948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:32 compute-1 ceph-mon[80107]: pgmap v511: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 09:59:32 compute-1 python3.9[211950]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 09:59:32 compute-1 systemd[1]: Reloading.
Jan 26 09:59:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:59:32 compute-1 systemd-rc-local-generator[211977]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 09:59:32 compute-1 systemd-sysv-generator[211981]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 09:59:32 compute-1 sudo[211948]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 09:59:33 compute-1 python3.9[212136]: ansible-ansible.builtin.service_facts Invoked
Jan 26 09:59:33 compute-1 network[212153]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 09:59:33 compute-1 network[212154]: 'network-scripts' will be removed from distribution in near future.
Jan 26 09:59:33 compute-1 network[212155]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 09:59:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:34 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000058s ======
Jan 26 09:59:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:34.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000058s
Jan 26 09:59:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:34.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:34 compute-1 ceph-mon[80107]: pgmap v512: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 09:59:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:59:34 compute-1 sudo[212161]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 09:59:34 compute-1 sudo[212161]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:59:34 compute-1 sudo[212161]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:34 compute-1 sudo[212189]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 09:59:34 compute-1 sudo[212189]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:59:35 compute-1 sshd-session[212185]: Invalid user ubuntu from 104.248.198.71 port 50012
Jan 26 09:59:35 compute-1 sshd-session[212185]: Connection closed by invalid user ubuntu 104.248.198.71 port 50012 [preauth]
Jan 26 09:59:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:35 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:35 compute-1 sudo[212189]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:35 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:36 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:36.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000059s ======
Jan 26 09:59:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:36.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000059s
Jan 26 09:59:36 compute-1 ceph-mon[80107]: pgmap v513: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 597 B/s wr, 2 op/s
Jan 26 09:59:36 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:59:36 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:59:36 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:59:36 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 09:59:36 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:59:36 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:59:36 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 09:59:36 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 09:59:36 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 09:59:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:36 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 09:59:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:36 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:59:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:36 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 09:59:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:37 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:37 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 26 09:59:37 compute-1 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 26 09:59:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:37 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:59:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:38 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000059s ======
Jan 26 09:59:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:38.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000059s
Jan 26 09:59:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:38.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:38 compute-1 ceph-mon[80107]: pgmap v514: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 597 B/s wr, 2 op/s
Jan 26 09:59:39 compute-1 sudo[212386]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:59:39 compute-1 sudo[212386]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:59:39 compute-1 sudo[212386]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 09:59:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:40 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b30002c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:40.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:40.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:40 compute-1 ceph-mon[80107]: pgmap v515: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 597 B/s wr, 2 op/s
Jan 26 09:59:40 compute-1 sudo[212413]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 09:59:40 compute-1 sudo[212413]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:59:40 compute-1 sudo[212413]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:41 compute-1 sudo[212563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbjyiyhuidmexrckskwgelduvfofahda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421580.759396-1334-52161844490356/AnsiballZ_systemd_service.py'
Jan 26 09:59:41 compute-1 sudo[212563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:41 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:41 compute-1 python3.9[212565]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:59:41 compute-1 sudo[212563]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:41 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:59:41 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 09:59:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:41 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:41 compute-1 sudo[212717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzywxdyhxgrwvwksalpkuxogpzgognxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421581.6410346-1334-139775737762783/AnsiballZ_systemd_service.py'
Jan 26 09:59:41 compute-1 sudo[212717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:42 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:42 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000059s ======
Jan 26 09:59:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:42.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000059s
Jan 26 09:59:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:42.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:42 compute-1 python3.9[212719]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:59:42 compute-1 sudo[212717]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:42 compute-1 ceph-mon[80107]: pgmap v516: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:59:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:59:42 compute-1 sudo[212872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prgrbfpylzzyaqynuhwmesxphcfheuys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421582.491457-1334-248925937176992/AnsiballZ_systemd_service.py'
Jan 26 09:59:42 compute-1 sudo[212872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:43 compute-1 python3.9[212874]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:59:43 compute-1 sudo[212872]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:43 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b30002c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:43 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:43 compute-1 sudo[213026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unbzepyteueafalbzovlxvmxgvfmghba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421583.3729546-1334-105572217838443/AnsiballZ_systemd_service.py'
Jan 26 09:59:43 compute-1 sudo[213026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:43 compute-1 ceph-mon[80107]: pgmap v517: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:59:43 compute-1 python3.9[213028]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:59:43 compute-1 sshd-session[212722]: Connection closed by authenticating user root 178.62.249.31 port 60074 [preauth]
Jan 26 09:59:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:44 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:44.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:44.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:45 compute-1 sudo[213026]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:45 compute-1 podman[213077]: 2026-01-26 09:59:45.362464126 +0000 UTC m=+0.131137573 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 26 09:59:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095945 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 09:59:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:45 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:45 compute-1 sudo[213206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edxbyfmqbyahfqnkooanjjozzfnozypn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421585.1910064-1334-255577212942405/AnsiballZ_systemd_service.py'
Jan 26 09:59:45 compute-1 sudo[213206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:45 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b30002c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:45 compute-1 python3.9[213208]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:59:45 compute-1 sudo[213206]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:46 compute-1 ceph-mon[80107]: pgmap v518: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 09:59:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:46.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000059s ======
Jan 26 09:59:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:46.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000059s
Jan 26 09:59:46 compute-1 sudo[213359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuktbhlftsvlgksqumwuepcpoqclwmko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421586.0137448-1334-259554953725069/AnsiballZ_systemd_service.py'
Jan 26 09:59:46 compute-1 sudo[213359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:46 compute-1 python3.9[213361]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:59:46 compute-1 sudo[213359]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:47 compute-1 sudo[213512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egfcatsapvmudfuminivkcwtubelfflw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421586.91729-1334-217863610202598/AnsiballZ_systemd_service.py'
Jan 26 09:59:47 compute-1 sudo[213512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:47 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:47 compute-1 python3.9[213514]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:59:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:47 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:59:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:48 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:48.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:48.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:48 compute-1 ceph-mon[80107]: pgmap v519: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:59:48 compute-1 sudo[213512]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:49 compute-1 sudo[213666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iitarvkgwofzgvexwifybcpbklijgxbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421588.669255-1334-276529280710958/AnsiballZ_systemd_service.py'
Jan 26 09:59:49 compute-1 sudo[213666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 09:59:49 compute-1 python3.9[213668]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 09:59:49 compute-1 sudo[213666]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:49 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:49 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:50 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:50 compute-1 sudo[213820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trcpkhbyzyttpsbdqcobdfdxjphqijjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421589.8412936-1511-83283661783747/AnsiballZ_file.py'
Jan 26 09:59:50 compute-1 sudo[213820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000058s ======
Jan 26 09:59:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:50.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000058s
Jan 26 09:59:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:50.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:50 compute-1 ceph-mon[80107]: pgmap v520: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 09:59:50 compute-1 python3.9[213822]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:50 compute-1 sudo[213820]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:50 compute-1 sudo[213972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrqpezdypbntlviaakynehwxhipfvkdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421590.5543234-1511-72027930892048/AnsiballZ_file.py'
Jan 26 09:59:50 compute-1 sudo[213972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:51 compute-1 python3.9[213974]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:51 compute-1 sudo[213972]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:51 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:51 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:51 compute-1 sudo[214127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azolmffgihbldzgktlikuvwpewbfprxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421591.4051049-1511-131515312120535/AnsiballZ_file.py'
Jan 26 09:59:51 compute-1 sudo[214127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:51 compute-1 python3.9[214129]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:51 compute-1 sudo[214127]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:52 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000058s ======
Jan 26 09:59:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:52.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000058s
Jan 26 09:59:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:52.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:52 compute-1 ceph-mon[80107]: pgmap v521: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 426 B/s wr, 2 op/s
Jan 26 09:59:52 compute-1 sudo[214279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-givwqmetdxvongitiappcgkhbrnvpqxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421592.0775552-1511-192379476419722/AnsiballZ_file.py'
Jan 26 09:59:52 compute-1 sudo[214279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:52 compute-1 python3.9[214281]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:52 compute-1 sudo[214279]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:59:53 compute-1 sudo[214431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqrnvmjadvqktuypuxbxbkerqwqfgaon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421592.780978-1511-132318018842311/AnsiballZ_file.py'
Jan 26 09:59:53 compute-1 sudo[214431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:53 compute-1 python3.9[214433]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:53 compute-1 sudo[214431]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:53 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:53 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:53 compute-1 sudo[214584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcbgrgbkmemaxnjjkgvlfwcqtfqsiwod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421593.4913278-1511-246877524065359/AnsiballZ_file.py'
Jan 26 09:59:53 compute-1 sudo[214584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:59:53.921 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:59:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:59:53.922 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:59:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 09:59:53.922 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:59:54 compute-1 python3.9[214586]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:54 compute-1 sudo[214584]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:54 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:54.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:54.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:54 compute-1 ceph-mon[80107]: pgmap v522: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:59:54 compute-1 sudo[214736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-litnmdgdovzsxbpbkixaikrkpjszvaip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421594.146431-1511-170826488952590/AnsiballZ_file.py'
Jan 26 09:59:54 compute-1 sudo[214736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:54 compute-1 python3.9[214738]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:54 compute-1 sudo[214736]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:55 compute-1 sudo[214888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfzzbygytcegbdljrmyzppdfrpoebrca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421594.7695613-1511-132604451852581/AnsiballZ_file.py'
Jan 26 09:59:55 compute-1 sudo[214888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:55 compute-1 python3.9[214890]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:55 compute-1 sudo[214888]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:55 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:55 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:56 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:59:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:56.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:59:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:56.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:56 compute-1 ceph-mon[80107]: pgmap v523: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 26 09:59:56 compute-1 sudo[215041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slhiyjjhqmkqkwqvrnbzhopaycwgsiax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421596.3799026-1682-41150093353209/AnsiballZ_file.py'
Jan 26 09:59:56 compute-1 sudo[215041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:56 compute-1 python3.9[215043]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:56 compute-1 sudo[215041]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:57 compute-1 sudo[215193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwuxzcystctgjlqfnbbgfxargiwzsvib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421597.0820441-1682-240527965625392/AnsiballZ_file.py'
Jan 26 09:59:57 compute-1 sudo[215193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:57 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:57 compute-1 python3.9[215195]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:57 compute-1 sudo[215193]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:57 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 09:59:58 compute-1 sudo[215346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poeajccoeyibxfviqrvhhttfhzhrdque ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421597.6952338-1682-199559316220377/AnsiballZ_file.py'
Jan 26 09:59:58 compute-1 sudo[215346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:58 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:58 compute-1 python3.9[215348]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:58 compute-1 sudo[215346]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 09:59:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:58.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 09:59:58 compute-1 ceph-mon[80107]: pgmap v524: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 09:59:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 09:59:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 09:59:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:58.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 09:59:58 compute-1 sudo[215498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uithqfitpacnjedtmsuirxagewvwysbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421598.3753402-1682-42183379629431/AnsiballZ_file.py'
Jan 26 09:59:58 compute-1 sudo[215498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:58 compute-1 python3.9[215500]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:58 compute-1 sudo[215498]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:59 compute-1 podman[215624]: 2026-01-26 09:59:59.263438291 +0000 UTC m=+0.048495525 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 26 09:59:59 compute-1 sudo[215667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vupmyflljhddciiloqtpqzzidztoiumz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421598.9548368-1682-212499655843499/AnsiballZ_file.py'
Jan 26 09:59:59 compute-1 sudo[215667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:59:59 compute-1 sudo[215672]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 09:59:59 compute-1 sudo[215672]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 09:59:59 compute-1 sudo[215672]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:59 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 09:59:59 compute-1 python3.9[215671]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 09:59:59 compute-1 sudo[215667]: pam_unix(sudo:session): session closed for user root
Jan 26 09:59:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:59 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:00 compute-1 sudo[215847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwkyjfyhfirlvfwlnulwanmuggpkunpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421599.5887527-1682-84709630122836/AnsiballZ_file.py'
Jan 26 10:00:00 compute-1 sudo[215847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:00 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:00.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:00.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:00 compute-1 python3.9[215849]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 10:00:00 compute-1 sudo[215847]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:00 compute-1 ceph-mon[80107]: pgmap v525: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:00:00 compute-1 ceph-mon[80107]: overall HEALTH_OK
Jan 26 10:00:00 compute-1 sudo[215999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmtcvujwneioblryyplrsynnwefqqdbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421600.4367707-1682-141677834253246/AnsiballZ_file.py'
Jan 26 10:00:00 compute-1 sudo[215999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:00 compute-1 python3.9[216001]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 10:00:00 compute-1 sudo[215999]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:01 compute-1 sudo[216151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxwqyrluwsljsejxxqyxoppsldxebpjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421601.0759122-1682-226368763436236/AnsiballZ_file.py'
Jan 26 10:00:01 compute-1 sudo[216151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:01 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:01 compute-1 python3.9[216153]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 10:00:01 compute-1 sudo[216151]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:01 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:01 compute-1 ceph-mon[80107]: pgmap v526: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 10:00:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:02 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:00:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:02.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:00:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:02.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:00:02 compute-1 sudo[216304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tviunrtjotpkbjdgrgrvqdbayknarwov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421602.4963439-1856-60247015210391/AnsiballZ_command.py'
Jan 26 10:00:02 compute-1 sudo[216304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:02 compute-1 python3.9[216306]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 10:00:02 compute-1 sudo[216304]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:03 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:03 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:03 compute-1 python3.9[216459]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 10:00:04 compute-1 ceph-mon[80107]: pgmap v527: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:00:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:00:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:04 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:04.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:04.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:04 compute-1 sudo[216609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spsiqhwgmavkxchwwthdxmounbruhtds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421604.2621627-1910-254836438064177/AnsiballZ_systemd_service.py'
Jan 26 10:00:04 compute-1 sudo[216609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:04 compute-1 python3.9[216611]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 10:00:04 compute-1 systemd[1]: Reloading.
Jan 26 10:00:05 compute-1 systemd-rc-local-generator[216638]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 10:00:05 compute-1 systemd-sysv-generator[216642]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 10:00:05 compute-1 sudo[216609]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:05 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:05 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:05 compute-1 sudo[216797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtqoslgrgwmkrktckpmhoiibqdrzuwmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421605.5095797-1934-116195238699276/AnsiballZ_command.py'
Jan 26 10:00:05 compute-1 sudo[216797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:05 compute-1 python3.9[216799]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 10:00:05 compute-1 sudo[216797]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:06 compute-1 ceph-mon[80107]: pgmap v528: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.110889) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421606110967, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 941, "num_deletes": 251, "total_data_size": 2175388, "memory_usage": 2212640, "flush_reason": "Manual Compaction"}
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Jan 26 10:00:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:06 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421606126954, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 1418430, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19804, "largest_seqno": 20739, "table_properties": {"data_size": 1414143, "index_size": 2003, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9454, "raw_average_key_size": 19, "raw_value_size": 1405560, "raw_average_value_size": 2898, "num_data_blocks": 90, "num_entries": 485, "num_filter_entries": 485, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769421533, "oldest_key_time": 1769421533, "file_creation_time": 1769421606, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 16142 microseconds, and 8230 cpu microseconds.
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.127041) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 1418430 bytes OK
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.127068) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.128506) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.128535) EVENT_LOG_v1 {"time_micros": 1769421606128528, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.128578) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 2170674, prev total WAL file size 2170674, number of live WAL files 2.
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.129679) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(1385KB)], [36(13MB)]
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421606129773, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 15250410, "oldest_snapshot_seqno": -1}
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 4971 keys, 13068717 bytes, temperature: kUnknown
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421606207233, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 13068717, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13034209, "index_size": 20958, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12485, "raw_key_size": 126769, "raw_average_key_size": 25, "raw_value_size": 12942829, "raw_average_value_size": 2603, "num_data_blocks": 861, "num_entries": 4971, "num_filter_entries": 4971, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769421606, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.207503) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 13068717 bytes
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.208930) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 196.6 rd, 168.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 13.2 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(20.0) write-amplify(9.2) OK, records in: 5487, records dropped: 516 output_compression: NoCompression
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.208960) EVENT_LOG_v1 {"time_micros": 1769421606208947, "job": 20, "event": "compaction_finished", "compaction_time_micros": 77586, "compaction_time_cpu_micros": 24848, "output_level": 6, "num_output_files": 1, "total_output_size": 13068717, "num_input_records": 5487, "num_output_records": 4971, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421606209509, "job": 20, "event": "table_file_deletion", "file_number": 38}
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421606213869, "job": 20, "event": "table_file_deletion", "file_number": 36}
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.129596) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.213950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.213955) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.213957) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.213958) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:00:06 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.213960) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:00:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:06.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:06.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:06 compute-1 sudo[216950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hresafulpdrxttfuuxovopcrntsedqir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421606.1411636-1934-173635019549077/AnsiballZ_command.py'
Jan 26 10:00:06 compute-1 sudo[216950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:06 compute-1 python3.9[216952]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 10:00:06 compute-1 sudo[216950]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:07 compute-1 sudo[217103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gauqbbpghcxaonqgzrjfrkaqngoeyueq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421606.7808821-1934-46526637101462/AnsiballZ_command.py'
Jan 26 10:00:07 compute-1 sudo[217103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:07 compute-1 python3.9[217105]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 10:00:07 compute-1 sudo[217103]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:07 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:07 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:00:07 compute-1 sudo[217257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckyfyulyooxssvzjartfrnbppzoilfcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421607.4725964-1934-52842823292502/AnsiballZ_command.py'
Jan 26 10:00:07 compute-1 sudo[217257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:07 compute-1 python3.9[217259]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 10:00:08 compute-1 sudo[217257]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:08 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:08 compute-1 ceph-mon[80107]: pgmap v529: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:00:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:08.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:08.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:08 compute-1 sudo[217410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udduhzkpivfblmominrjaytzkcldcbgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421608.1403303-1934-164983784151409/AnsiballZ_command.py'
Jan 26 10:00:08 compute-1 sudo[217410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:08 compute-1 python3.9[217412]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 10:00:08 compute-1 sudo[217410]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:09 compute-1 sudo[217565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkddikullkjpjsihmzzbhhntqxzexdpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421608.7580855-1934-31553349382560/AnsiballZ_command.py'
Jan 26 10:00:09 compute-1 sudo[217565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:09 compute-1 python3.9[217567]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 10:00:09 compute-1 sudo[217565]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:09 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:09 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:09 compute-1 sudo[217719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmqxidejstlmookyxvenrgnmjmgpphcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421609.4462717-1934-117990585207772/AnsiballZ_command.py'
Jan 26 10:00:09 compute-1 sudo[217719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:09 compute-1 python3.9[217721]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 10:00:09 compute-1 sudo[217719]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:10 compute-1 sshd-session[217413]: Connection closed by authenticating user root 152.42.136.110 port 41620 [preauth]
Jan 26 10:00:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:10 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:10 compute-1 ceph-mon[80107]: pgmap v530: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:00:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:10.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:10.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:10 compute-1 sudo[217872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiydocozjgedfqqazagtcevcrfmumqvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421610.0531225-1934-281158362186941/AnsiballZ_command.py'
Jan 26 10:00:10 compute-1 sudo[217872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:10 compute-1 python3.9[217874]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 10:00:10 compute-1 sudo[217872]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:11 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:11 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:12 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:12 compute-1 ceph-mon[80107]: pgmap v531: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 10:00:12 compute-1 sudo[218026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpthjrwlbzumbyaeevbaisexussgokus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421611.9808884-2141-103840130953784/AnsiballZ_file.py'
Jan 26 10:00:12 compute-1 sudo[218026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:12.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:12.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:12 compute-1 python3.9[218028]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 10:00:12 compute-1 sudo[218026]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:00:12 compute-1 sudo[218178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqymrpjzpoyqnxvzvmtraakqkyfgymfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421612.6147008-2141-212578986583976/AnsiballZ_file.py'
Jan 26 10:00:12 compute-1 sudo[218178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:13 compute-1 python3.9[218180]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 10:00:13 compute-1 sudo[218178]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:13 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:13 compute-1 sudo[218331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvnerkjdkugrtiblqredapxjlefjqhhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421613.171205-2141-196413606435435/AnsiballZ_file.py'
Jan 26 10:00:13 compute-1 sudo[218331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:13 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:13 compute-1 python3.9[218333]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 10:00:13 compute-1 sudo[218331]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:14 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:14 compute-1 ceph-mon[80107]: pgmap v532: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:00:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:14.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:14.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:14 compute-1 sudo[218483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfiylzuzoewvbwqyobziteoojxqqzjuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421614.355497-2207-77200863977871/AnsiballZ_file.py'
Jan 26 10:00:14 compute-1 sudo[218483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:14 compute-1 python3.9[218485]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 10:00:14 compute-1 sudo[218483]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:15 compute-1 sudo[218635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncsagmafrudpzdrpwfcbyntuwaqefvqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421615.0087392-2207-197090075883217/AnsiballZ_file.py'
Jan 26 10:00:15 compute-1 sudo[218635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:15 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:15 compute-1 python3.9[218637]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 10:00:15 compute-1 sudo[218635]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:15 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:15 compute-1 podman[218638]: 2026-01-26 10:00:15.672074828 +0000 UTC m=+0.173757128 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Jan 26 10:00:15 compute-1 sudo[218815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcmybqciqihljrqorcsuvkkkouswjszr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421615.6675558-2207-48779663069972/AnsiballZ_file.py'
Jan 26 10:00:15 compute-1 sudo[218815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:16 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:16 compute-1 python3.9[218817]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 10:00:16 compute-1 sudo[218815]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:16.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:00:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:16 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:16.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:16 compute-1 ceph-mon[80107]: pgmap v533: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 10:00:16 compute-1 sudo[218968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgdducioufdntwzgyyrrmjjhmuqjtzlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421616.2906654-2207-213987067499817/AnsiballZ_file.py'
Jan 26 10:00:16 compute-1 sudo[218968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:16 compute-1 python3.9[218970]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 10:00:16 compute-1 sudo[218968]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:17 compute-1 sudo[219120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfoedfhcslxljzwogsppwggwneongqbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421616.9761426-2207-138116704299476/AnsiballZ_file.py'
Jan 26 10:00:17 compute-1 sudo[219120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:17 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:17 compute-1 python3.9[219122]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 10:00:17 compute-1 sudo[219120]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:17 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:00:17 compute-1 sudo[219273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juelptsrqystlonqvsgzuddndklmezqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421617.653788-2207-58715904566265/AnsiballZ_file.py'
Jan 26 10:00:17 compute-1 sudo[219273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:18 compute-1 python3.9[219275]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 10:00:18 compute-1 sudo[219273]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:18 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:18.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:18.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:18 compute-1 sudo[219425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxyldjoyajiwuyttztijxhooummedmcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421618.2422147-2207-150481952875152/AnsiballZ_file.py'
Jan 26 10:00:18 compute-1 sudo[219425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:18 compute-1 ceph-mon[80107]: pgmap v534: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:00:18 compute-1 python3.9[219427]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 10:00:18 compute-1 sudo[219425]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:19 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:19 compute-1 sudo[219452]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:00:19 compute-1 sudo[219452]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:00:19 compute-1 sudo[219452]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:00:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:19 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:20 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:00:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:20 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:20.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:20.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:20 compute-1 ceph-mon[80107]: pgmap v535: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:00:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:21 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:21 compute-1 ceph-mon[80107]: pgmap v536: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 10:00:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:21 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:21 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 10:00:21 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 6838 writes, 27K keys, 6838 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 6838 writes, 1330 syncs, 5.14 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 522 writes, 785 keys, 522 commit groups, 1.0 writes per commit group, ingest: 0.25 MB, 0.00 MB/s
                                           Interval WAL: 522 writes, 261 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Jan 26 10:00:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:22 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000062s ======
Jan 26 10:00:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:22.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000062s
Jan 26 10:00:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:00:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:22 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:22.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:00:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100023 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 10:00:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:23 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:23 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:24 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:24 compute-1 ceph-mon[80107]: pgmap v537: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:00:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:00:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:24.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000061s ======
Jan 26 10:00:24 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:24.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000061s
Jan 26 10:00:24 compute-1 sudo[219605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzmdzxtcuasrsvztlicfazeoplxittjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421624.5421376-2532-220041496420006/AnsiballZ_getent.py'
Jan 26 10:00:24 compute-1 sudo[219605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:25 compute-1 python3.9[219607]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 26 10:00:25 compute-1 sudo[219605]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:25 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:25 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:25 compute-1 sshd-session[219633]: Invalid user ubuntu from 104.248.198.71 port 57602
Jan 26 10:00:26 compute-1 sudo[219761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvernydormbypbneiguejudghiyvbmdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421625.5764468-2556-227407925514371/AnsiballZ_group.py'
Jan 26 10:00:26 compute-1 sudo[219761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:26 compute-1 sshd-session[219633]: Connection closed by invalid user ubuntu 104.248.198.71 port 57602 [preauth]
Jan 26 10:00:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:26 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:26 compute-1 ceph-mon[80107]: pgmap v538: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:00:26 compute-1 python3.9[219763]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 10:00:26 compute-1 groupadd[219764]: group added to /etc/group: name=nova, GID=42436
Jan 26 10:00:26 compute-1 groupadd[219764]: group added to /etc/gshadow: name=nova
Jan 26 10:00:26 compute-1 groupadd[219764]: new group: name=nova, GID=42436
Jan 26 10:00:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000061s ======
Jan 26 10:00:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:26.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000061s
Jan 26 10:00:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:00:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:26 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:26.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:26 compute-1 sudo[219761]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:26 compute-1 sudo[219919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqrdxzbzwowxrscxykujeiwtugrkjvuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421626.5010002-2580-140935754528421/AnsiballZ_user.py'
Jan 26 10:00:26 compute-1 sudo[219919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:27 compute-1 python3.9[219921]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 10:00:27 compute-1 useradd[219923]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Jan 26 10:00:27 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 10:00:27 compute-1 useradd[219923]: add 'nova' to group 'libvirt'
Jan 26 10:00:27 compute-1 useradd[219923]: add 'nova' to shadow group 'libvirt'
Jan 26 10:00:27 compute-1 sudo[219919]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:27 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c003430 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:27 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:00:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:28 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:28 compute-1 ceph-mon[80107]: pgmap v539: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 10:00:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:00:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:28.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:28 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:28.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:28 compute-1 sshd-session[219956]: Accepted publickey for zuul from 192.168.122.30 port 35016 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 10:00:29 compute-1 systemd-logind[783]: New session 54 of user zuul.
Jan 26 10:00:29 compute-1 systemd[1]: Started Session 54 of User zuul.
Jan 26 10:00:29 compute-1 sshd-session[219956]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 10:00:29 compute-1 sshd-session[219959]: Received disconnect from 192.168.122.30 port 35016:11: disconnected by user
Jan 26 10:00:29 compute-1 sshd-session[219959]: Disconnected from user zuul 192.168.122.30 port 35016
Jan 26 10:00:29 compute-1 sshd-session[219956]: pam_unix(sshd:session): session closed for user zuul
Jan 26 10:00:29 compute-1 systemd[1]: session-54.scope: Deactivated successfully.
Jan 26 10:00:29 compute-1 systemd-logind[783]: Session 54 logged out. Waiting for processes to exit.
Jan 26 10:00:29 compute-1 systemd-logind[783]: Removed session 54.
Jan 26 10:00:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:29 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:29 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c003430 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:29 compute-1 podman[220084]: 2026-01-26 10:00:29.726212333 +0000 UTC m=+0.053086915 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 26 10:00:29 compute-1 python3.9[220127]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 10:00:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:30 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:30 compute-1 ceph-mon[80107]: pgmap v540: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 10:00:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000061s ======
Jan 26 10:00:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:30.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000061s
Jan 26 10:00:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:00:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:30 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:30.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:30 compute-1 python3.9[220250]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421629.3873346-2655-38405105932395/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 10:00:30 compute-1 python3.9[220402]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 10:00:31 compute-1 python3.9[220478]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 10:00:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:31 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:31 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:31 compute-1 sshd-session[220298]: Connection closed by authenticating user root 178.62.249.31 port 40896 [preauth]
Jan 26 10:00:32 compute-1 python3.9[220629]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 10:00:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:32 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:00:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:32.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:32 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:32.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:32 compute-1 ceph-mon[80107]: pgmap v541: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 26 10:00:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:32 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:00:32 compute-1 python3.9[220750]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421631.5555131-2655-189613474164202/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 10:00:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:00:33 compute-1 python3.9[220900]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 10:00:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:33 compute-1 python3.9[221022]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421632.828948-2655-13822516807092/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 10:00:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:34 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:00:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000061s ======
Jan 26 10:00:34 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:34.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000061s
Jan 26 10:00:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000061s ======
Jan 26 10:00:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:34.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000061s
Jan 26 10:00:34 compute-1 python3.9[221172]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 10:00:34 compute-1 ceph-mon[80107]: pgmap v542: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 26 10:00:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:00:34 compute-1 python3.9[221293]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421633.929032-2655-205495113716187/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 10:00:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:35 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:00:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:35 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:00:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:35 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:35 compute-1 python3.9[221443]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 10:00:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:35 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:35 compute-1 python3.9[221567]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421635.1166809-2655-211668518473945/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 10:00:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:36 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:36.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:36.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:36 compute-1 ceph-mon[80107]: pgmap v543: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 10:00:37 compute-1 sudo[221717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zebeyweiermpggqukvokjjgkwpsycxgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421637.190873-2904-161928321996899/AnsiballZ_file.py'
Jan 26 10:00:37 compute-1 sudo[221717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:37 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20004000 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:37 compute-1 python3.9[221719]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 10:00:37 compute-1 sudo[221717]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:37 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:00:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:38 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:38.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:38.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:38 compute-1 sudo[221870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vukuzicznllqjwbgpmyvifotjueqlavu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421638.0619006-2928-147524938887836/AnsiballZ_copy.py'
Jan 26 10:00:38 compute-1 sudo[221870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:38 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 10:00:38 compute-1 python3.9[221872]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 10:00:38 compute-1 sudo[221870]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:38 compute-1 ceph-mon[80107]: pgmap v544: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 10:00:39 compute-1 sudo[222022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjseufytvpludcvialomrvjbpszbgtdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421638.7983594-2952-246159393929553/AnsiballZ_stat.py'
Jan 26 10:00:39 compute-1 sudo[222022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:39 compute-1 python3.9[222024]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 10:00:39 compute-1 sudo[222022]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread fragmentation_score=0.000026 took=0.000118s
Jan 26 10:00:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:39 compute-1 sudo[222049]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:00:39 compute-1 sudo[222049]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:00:39 compute-1 sudo[222049]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:39 compute-1 ceph-mon[80107]: pgmap v545: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 10:00:39 compute-1 sudo[222200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxqnopucerwaljkbzjbihrzifslhkbdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421639.5866902-2976-234382837844208/AnsiballZ_stat.py'
Jan 26 10:00:39 compute-1 sudo[222200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:40 compute-1 python3.9[222202]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 10:00:40 compute-1 sudo[222200]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:40 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:40.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:40.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:40 compute-1 sudo[222323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvvhqogumcllgfieadgoyofdkqzoccro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421639.5866902-2976-234382837844208/AnsiballZ_copy.py'
Jan 26 10:00:40 compute-1 sudo[222323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:40 compute-1 python3.9[222325]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769421639.5866902-2976-234382837844208/.source _original_basename=.dqxtjeui follow=False checksum=85aac6417ba27e9322cb5a036dc1fd0f95ce2de4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 26 10:00:40 compute-1 sudo[222323]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:40 compute-1 sudo[222352]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:00:40 compute-1 sudo[222352]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:00:40 compute-1 sudo[222352]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:40 compute-1 sudo[222377]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Jan 26 10:00:40 compute-1 sudo[222377]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:00:41 compute-1 sudo[222377]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:41 compute-1 sudo[222422]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:00:41 compute-1 sudo[222422]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:00:41 compute-1 sudo[222422]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:41 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:41 compute-1 sudo[222463]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:00:41 compute-1 sudo[222463]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:00:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:41 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:41 compute-1 python3.9[222598]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 10:00:41 compute-1 sudo[222463]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:42 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:42 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:42.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:42.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:42 compute-1 ceph-mon[80107]: pgmap v546: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 10:00:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:00:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:00:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:00:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:00:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:00:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:00:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:00:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:00:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:00:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:00:42 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:00:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:00:42 compute-1 python3.9[222781]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 10:00:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:43 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100043 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 10:00:43 compute-1 python3.9[222902]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421642.1911008-3054-222265330781035/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=53b8456782b81b5794d3eef3fadcfb00db1088a8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 10:00:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:43 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:43 compute-1 ceph-mon[80107]: pgmap v547: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Jan 26 10:00:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:44 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:44.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:44.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:44 compute-1 python3.9[223053]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 10:00:45 compute-1 python3.9[223174]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421644.2105863-3099-112455315419283/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=0333d3a3f5c3a0526b0ebe430250032166710e8a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 10:00:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:45 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:45 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:46 compute-1 ceph-mon[80107]: pgmap v548: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 10:00:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:46 compute-1 sudo[223337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-speqlvivtubsdmprhqdkouxyvzbcyfah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421645.7457793-3150-70866041404084/AnsiballZ_container_config_data.py'
Jan 26 10:00:46 compute-1 sudo[223337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000061s ======
Jan 26 10:00:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:46.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000061s
Jan 26 10:00:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:46.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:46 compute-1 podman[223299]: 2026-01-26 10:00:46.32042889 +0000 UTC m=+0.126948190 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 26 10:00:46 compute-1 python3.9[223346]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 26 10:00:46 compute-1 sudo[223337]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:47 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:47 compute-1 sudo[223503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quhvpkrjtikysnmeubikcgsccaeinkuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421647.1799724-3183-95561775018778/AnsiballZ_container_config_hash.py'
Jan 26 10:00:47 compute-1 sudo[223503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:47 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:00:47 compute-1 python3.9[223505]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 10:00:47 compute-1 sudo[223503]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:48 compute-1 ceph-mon[80107]: pgmap v549: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 10:00:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:00:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:00:48 compute-1 sudo[223530]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:00:48 compute-1 sudo[223530]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:00:48 compute-1 sudo[223530]: pam_unix(sudo:session): session closed for user root
Jan 26 10:00:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:48 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20004340 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:48.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:48.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:48 compute-1 sudo[223680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfltspgzmcpmqiajuevuybolzyxstomb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769421648.3824086-3213-90667556207450/AnsiballZ_edpm_container_manage.py'
Jan 26 10:00:48 compute-1 sudo[223680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:00:49 compute-1 python3[223682]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 10:00:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:00:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:49 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:49 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:50 compute-1 ceph-mon[80107]: pgmap v550: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 10:00:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:50 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14003e10 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:50.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:50.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:51 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20004340 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:51 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:52 compute-1 sshd-session[223719]: Connection closed by authenticating user root 152.42.136.110 port 43596 [preauth]
Jan 26 10:00:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:52 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:52.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:52.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:00:53 compute-1 ceph-mon[80107]: pgmap v551: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 426 B/s wr, 2 op/s
Jan 26 10:00:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:53 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:53 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20004340 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:00:53.922 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:00:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:00:53.923 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:00:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:00:53.923 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:00:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:54 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ec0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:54.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:54.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:54 compute-1 ceph-mon[80107]: pgmap v552: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 26 10:00:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:55 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:55 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:56 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20004340 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:56.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:56.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:57 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ee0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:57 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:00:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:58 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:00:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:58.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:00:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:00:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:00:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:58.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:00:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:59 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20004340 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:00:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:59 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003f00 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:00 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003f00 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:00.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:01:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:00.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:01:00 compute-1 sudo[223768]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:01:00 compute-1 sudo[223768]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:01:00 compute-1 sudo[223768]: pam_unix(sudo:session): session closed for user root
Jan 26 10:01:01 compute-1 podman[223791]: 2026-01-26 10:01:01.288894876 +0000 UTC m=+1.059005322 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 26 10:01:01 compute-1 ceph-mon[80107]: pgmap v553: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 26 10:01:01 compute-1 podman[223695]: 2026-01-26 10:01:01.399978048 +0000 UTC m=+12.270399925 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 26 10:01:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:01 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003f00 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:01 compute-1 podman[223836]: 2026-01-26 10:01:01.534455028 +0000 UTC m=+0.051911507 container create f8473b9a2106532a3892fa37e60a6e60a0eb71b6d886197e161aaec5fb0ab5d9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 26 10:01:01 compute-1 podman[223836]: 2026-01-26 10:01:01.505002925 +0000 UTC m=+0.022459424 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 26 10:01:01 compute-1 python3[223682]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 26 10:01:01 compute-1 sudo[223680]: pam_unix(sudo:session): session closed for user root
Jan 26 10:01:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:01 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20004340 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:01 compute-1 CROND[223898]: (root) CMD (run-parts /etc/cron.hourly)
Jan 26 10:01:01 compute-1 run-parts[223902]: (/etc/cron.hourly) starting 0anacron
Jan 26 10:01:01 compute-1 run-parts[223908]: (/etc/cron.hourly) finished 0anacron
Jan 26 10:01:01 compute-1 CROND[223895]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 26 10:01:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:02 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003f00 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:02 compute-1 ceph-mon[80107]: pgmap v554: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:01:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:02 compute-1 ceph-mon[80107]: pgmap v555: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:01:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:02.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:02 compute-1 ceph-mon[80107]: pgmap v556: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 10:01:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:01:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:02.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:01:02 compute-1 sudo[224034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aswivqmblmngqrqgqhxckekschmpdbbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421662.124286-3237-193904469213743/AnsiballZ_stat.py'
Jan 26 10:01:02 compute-1 sudo[224034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:01:02 compute-1 python3.9[224036]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 10:01:02 compute-1 sudo[224034]: pam_unix(sudo:session): session closed for user root
Jan 26 10:01:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:01:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:03 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14003e10 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:03 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:03 compute-1 sudo[224189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwrskizbfctvpbyoiidkfdoidrhveylg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421663.4671948-3274-241340895920743/AnsiballZ_container_config_data.py'
Jan 26 10:01:03 compute-1 sudo[224189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:01:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:04 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20004340 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:04 compute-1 ceph-mon[80107]: pgmap v557: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:01:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:04.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:01:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:04.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:01:04 compute-1 python3.9[224191]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 26 10:01:04 compute-1 sudo[224189]: pam_unix(sudo:session): session closed for user root
Jan 26 10:01:05 compute-1 sudo[224341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhaahexpwbbglskjvloioxglmtycjycc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421664.7719827-3306-216644710689183/AnsiballZ_container_config_hash.py'
Jan 26 10:01:05 compute-1 sudo[224341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:01:05 compute-1 python3.9[224343]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 10:01:05 compute-1 sudo[224341]: pam_unix(sudo:session): session closed for user root
Jan 26 10:01:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:05 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003f20 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:05 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 10:01:05 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Cumulative writes: 3906 writes, 21K keys, 3906 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.05 MB/s
                                           Cumulative WAL: 3906 writes, 3906 syncs, 1.00 writes per sync, written: 0.05 GB, 0.05 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1448 writes, 6861 keys, 1448 commit groups, 1.0 writes per commit group, ingest: 16.40 MB, 0.03 MB/s
                                           Interval WAL: 1448 writes, 1448 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    118.4      0.27              0.10        10    0.027       0      0       0.0       0.0
                                             L6      1/0   12.46 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.5    157.9    134.3      0.85              0.25         9    0.095     43K   4846       0.0       0.0
                                            Sum      1/0   12.46 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.5    119.4    130.5      1.13              0.35        19    0.059     43K   4846       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   5.3    135.1    135.5      0.47              0.16         8    0.059     22K   2570       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    157.9    134.3      0.85              0.25         9    0.095     43K   4846       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    119.5      0.27              0.10         9    0.030       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.032, interval 0.012
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.14 GB write, 0.12 MB/s write, 0.13 GB read, 0.11 MB/s read, 1.1 seconds
                                           Interval compaction: 0.06 GB write, 0.11 MB/s write, 0.06 GB read, 0.11 MB/s read, 0.5 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55af2cb209b0#2 capacity: 304.00 MB usage: 8.34 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000192 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(461,7.96 MB,2.61993%) FilterBlock(19,127.67 KB,0.041013%) IndexBlock(19,252.55 KB,0.0811276%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Jan 26 10:01:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:05 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14003e10 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:05 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:01:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:06 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:01:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:06.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:01:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:01:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:06.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:01:06 compute-1 ceph-mon[80107]: pgmap v558: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 10:01:06 compute-1 sudo[224494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opkfxriaivfwhzotpjyntwlwsplnlkit ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769421666.6391728-3336-208643275367864/AnsiballZ_edpm_container_manage.py'
Jan 26 10:01:06 compute-1 sudo[224494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:01:07 compute-1 python3[224496]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 10:01:07 compute-1 podman[224534]: 2026-01-26 10:01:07.309579739 +0000 UTC m=+0.057168022 container create e5e330867e7e5da6ef9c6b9af3e2d2c9ff3fe5beb198d40aa1d27e78e35090e6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.build-date=20251202, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 10:01:07 compute-1 podman[224534]: 2026-01-26 10:01:07.280279749 +0000 UTC m=+0.027868042 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 26 10:01:07 compute-1 python3[224496]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume 
/etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b kolla_start
Jan 26 10:01:07 compute-1 sudo[224494]: pam_unix(sudo:session): session closed for user root
Jan 26 10:01:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:07 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20004360 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:07 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003f40 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:01:07 compute-1 ceph-mon[80107]: pgmap v559: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:01:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:08 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24001090 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:08.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:01:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:08.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:01:08 compute-1 sudo[224726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxjwbpkkxmzyalgkkazkemtypqnktlie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421668.117206-3360-7223076287205/AnsiballZ_stat.py'
Jan 26 10:01:08 compute-1 sudo[224726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:01:08 compute-1 python3.9[224728]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 10:01:08 compute-1 sudo[224726]: pam_unix(sudo:session): session closed for user root
Jan 26 10:01:09 compute-1 sudo[224880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unbvxcrztpjanlrmrabkgdvntqqxnadl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421668.957263-3387-188086952966477/AnsiballZ_file.py'
Jan 26 10:01:09 compute-1 sudo[224880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:01:09 compute-1 python3.9[224882]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 10:01:09 compute-1 sudo[224880]: pam_unix(sudo:session): session closed for user root
Jan 26 10:01:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:09 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:09 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20004360 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:09 compute-1 ceph-mon[80107]: pgmap v560: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:01:09 compute-1 sudo[225032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whnctqbypgtgypligwwusazeldkzqnme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421669.5188947-3387-51787472195700/AnsiballZ_copy.py'
Jan 26 10:01:09 compute-1 sudo[225032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:01:10 compute-1 python3.9[225034]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769421669.5188947-3387-51787472195700/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 10:01:10 compute-1 sudo[225032]: pam_unix(sudo:session): session closed for user root
Jan 26 10:01:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:10 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003f40 fd 14 proxy ignored for local
Jan 26 10:01:10 compute-1 kernel: ganesha.nfsd[204931]: segfault at 50 ip 00007f7bc5e6032e sp 00007f7b70ff8210 error 4 in libntirpc.so.5.8[7f7bc5e45000+2c000] likely on CPU 6 (core 0, socket 6)
Jan 26 10:01:10 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 10:01:10 compute-1 systemd[1]: Started Process Core Dump (PID 225035/UID 0).
Jan 26 10:01:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:10.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:10.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:10 compute-1 sudo[225110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daisbabbqcmuweqvrlpugdwfykccfpcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421669.5188947-3387-51787472195700/AnsiballZ_systemd.py'
Jan 26 10:01:10 compute-1 sudo[225110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:01:10 compute-1 python3.9[225112]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 10:01:10 compute-1 systemd[1]: Reloading.
Jan 26 10:01:10 compute-1 systemd-sysv-generator[225144]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 10:01:10 compute-1 systemd-rc-local-generator[225139]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 10:01:11 compute-1 sudo[225110]: pam_unix(sudo:session): session closed for user root
Jan 26 10:01:11 compute-1 sudo[225222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toatphvkzoyyygrhgplcicdohuktzmsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421669.5188947-3387-51787472195700/AnsiballZ_systemd.py'
Jan 26 10:01:11 compute-1 sudo[225222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:01:11 compute-1 systemd-coredump[225048]: Process 203358 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 43:
                                                    #0  0x00007f7bc5e6032e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 26 10:01:11 compute-1 systemd[1]: systemd-coredump@7-225035-0.service: Deactivated successfully.
Jan 26 10:01:11 compute-1 systemd[1]: systemd-coredump@7-225035-0.service: Consumed 1.190s CPU time.
Jan 26 10:01:11 compute-1 podman[225230]: 2026-01-26 10:01:11.55532503 +0000 UTC m=+0.028854698 container died cbf27e002fa6e7d390d75e124dda783fcdb8c722ca7000472b152c8413a63b8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 10:01:11 compute-1 systemd[1]: var-lib-containers-storage-overlay-18b268ab60ff9aff8b27d65bea4ca79576ef17e6e07f8e0cca83a86a786d46ca-merged.mount: Deactivated successfully.
Jan 26 10:01:11 compute-1 podman[225230]: 2026-01-26 10:01:11.604426301 +0000 UTC m=+0.077955949 container remove cbf27e002fa6e7d390d75e124dda783fcdb8c722ca7000472b152c8413a63b8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 10:01:11 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 10:01:11 compute-1 python3.9[225224]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 10:01:11 compute-1 systemd[1]: Reloading.
Jan 26 10:01:11 compute-1 systemd-rc-local-generator[225301]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 10:01:11 compute-1 systemd-sysv-generator[225305]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 10:01:12 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 10:01:12 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.924s CPU time.
Jan 26 10:01:12 compute-1 systemd[1]: Starting nova_compute container...
Jan 26 10:01:12 compute-1 systemd[1]: Started libcrun container.
Jan 26 10:01:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 26 10:01:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 26 10:01:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 26 10:01:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 26 10:01:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 26 10:01:12 compute-1 podman[225312]: 2026-01-26 10:01:12.149597638 +0000 UTC m=+0.089145243 container init e5e330867e7e5da6ef9c6b9af3e2d2c9ff3fe5beb198d40aa1d27e78e35090e6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.build-date=20251202, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:01:12 compute-1 podman[225312]: 2026-01-26 10:01:12.157688399 +0000 UTC m=+0.097235974 container start e5e330867e7e5da6ef9c6b9af3e2d2c9ff3fe5beb198d40aa1d27e78e35090e6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm)
Jan 26 10:01:12 compute-1 ceph-mon[80107]: pgmap v561: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 10:01:12 compute-1 nova_compute[225328]: + sudo -E kolla_set_configs
Jan 26 10:01:12 compute-1 podman[225312]: nova_compute
Jan 26 10:01:12 compute-1 systemd[1]: Started nova_compute container.
Jan 26 10:01:12 compute-1 sudo[225222]: pam_unix(sudo:session): session closed for user root
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Validating config file
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Copying service configuration files
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Deleting /etc/ceph
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Creating directory /etc/ceph
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Setting permission for /etc/ceph
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Writing out command to execute
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 10:01:12 compute-1 nova_compute[225328]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 10:01:12 compute-1 nova_compute[225328]: ++ cat /run_command
Jan 26 10:01:12 compute-1 nova_compute[225328]: + CMD=nova-compute
Jan 26 10:01:12 compute-1 nova_compute[225328]: + ARGS=
Jan 26 10:01:12 compute-1 nova_compute[225328]: + sudo kolla_copy_cacerts
Jan 26 10:01:12 compute-1 nova_compute[225328]: + [[ ! -n '' ]]
Jan 26 10:01:12 compute-1 nova_compute[225328]: + . kolla_extend_start
Jan 26 10:01:12 compute-1 nova_compute[225328]: + echo 'Running command: '\''nova-compute'\'''
Jan 26 10:01:12 compute-1 nova_compute[225328]: Running command: 'nova-compute'
Jan 26 10:01:12 compute-1 nova_compute[225328]: + umask 0022
Jan 26 10:01:12 compute-1 nova_compute[225328]: + exec nova-compute
Jan 26 10:01:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:12.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:01:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:12.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:01:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:01:13 compute-1 python3.9[225491]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 10:01:14 compute-1 ceph-mon[80107]: pgmap v562: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:01:14 compute-1 nova_compute[225328]: 2026-01-26 10:01:14.302 225332 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 26 10:01:14 compute-1 nova_compute[225328]: 2026-01-26 10:01:14.302 225332 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 26 10:01:14 compute-1 nova_compute[225328]: 2026-01-26 10:01:14.303 225332 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 26 10:01:14 compute-1 nova_compute[225328]: 2026-01-26 10:01:14.303 225332 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 26 10:01:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:01:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:01:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:14.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:01:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:14 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:14.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:14 compute-1 nova_compute[225328]: 2026-01-26 10:01:14.438 225332 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:01:14 compute-1 nova_compute[225328]: 2026-01-26 10:01:14.462 225332 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:01:14 compute-1 nova_compute[225328]: 2026-01-26 10:01:14.463 225332 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 26 10:01:14 compute-1 python3.9[225645]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.152 225332 INFO nova.virt.driver [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.298 225332 INFO nova.compute.provider_config [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.321 225332 DEBUG oslo_concurrency.lockutils [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.322 225332 DEBUG oslo_concurrency.lockutils [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.322 225332 DEBUG oslo_concurrency.lockutils [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.322 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.323 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.323 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.323 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.323 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.323 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.323 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.324 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.324 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.324 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.324 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.324 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.324 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.325 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.325 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.325 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.325 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.325 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.325 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.326 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.326 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.326 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.326 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.326 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.326 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.327 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.327 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.327 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.327 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.327 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.328 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.328 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.328 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.328 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.328 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.329 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.329 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.329 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.329 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.329 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.330 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.330 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.330 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.330 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.331 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.331 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.331 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.331 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.331 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.331 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.332 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.332 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.332 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.332 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.332 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.333 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.333 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.333 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.333 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.333 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.333 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.333 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.334 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.334 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.334 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.334 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.334 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.334 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.334 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.335 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.335 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.335 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.335 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.335 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.335 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.335 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.336 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.336 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.336 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.336 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.336 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.337 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.337 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.337 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.337 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.337 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.338 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.338 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.338 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.338 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.338 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.338 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.339 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.339 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.339 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.339 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.339 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.339 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.339 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.339 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.340 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.340 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.340 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.340 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.340 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.340 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.341 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.341 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.341 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.341 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.341 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.341 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.342 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.342 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.342 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.342 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.342 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.342 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.343 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.343 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.343 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.343 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.343 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.343 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.344 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.344 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.344 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.344 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.344 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.344 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.345 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.345 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.345 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.345 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.345 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.345 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.345 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.346 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.346 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.346 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.346 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.346 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.346 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.346 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.347 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.347 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.347 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.347 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.347 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.347 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.348 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.348 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.348 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.348 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.348 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.348 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.349 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.349 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.349 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.349 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.349 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.349 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.349 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.350 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.350 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.350 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.350 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.350 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.350 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.351 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.351 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.351 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.351 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.351 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.351 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.352 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.352 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.352 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.352 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.352 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.352 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.353 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.353 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.353 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.353 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.353 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.353 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.354 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.354 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.354 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.354 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.354 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.354 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.355 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.355 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.355 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.355 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.355 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.355 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.356 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.356 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.356 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.356 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.356 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.356 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.356 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.357 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.357 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.357 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.357 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.357 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.357 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.358 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.358 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.358 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.358 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.358 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.358 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.358 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.359 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.359 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.359 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.359 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.359 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.359 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.360 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.360 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.360 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.360 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.360 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.360 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.361 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.361 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.361 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.361 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.361 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.361 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.362 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.362 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.362 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.362 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.362 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.362 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.362 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.363 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.363 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.363 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.363 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.363 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.363 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.363 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.364 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.364 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.364 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.364 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.364 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.364 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.365 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.365 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.365 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.365 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.365 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.365 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.365 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.366 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.366 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.366 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.366 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.366 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.366 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.367 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.367 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.367 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.367 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.367 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.367 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.368 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.368 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.368 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.368 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.368 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.368 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.369 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.369 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.369 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.369 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.369 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.370 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.370 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.370 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.370 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.370 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.370 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.370 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.371 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.371 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.371 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.371 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.371 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.372 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.372 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.372 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.372 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.372 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.373 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.373 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.373 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.373 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.373 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.374 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.374 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.374 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.374 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.374 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.374 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.375 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.375 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.375 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.375 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.375 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.375 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.376 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.376 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.376 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.376 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.376 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.377 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.377 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.377 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.377 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.377 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.378 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.378 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.378 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.378 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.378 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.378 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.379 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.379 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.379 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.379 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.379 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.379 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.380 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.380 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.380 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.380 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.380 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.381 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.381 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.381 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.381 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.381 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.381 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.381 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.382 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.382 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.382 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.382 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.382 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.383 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.383 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.383 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.383 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.383 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.383 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.384 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.384 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.384 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.384 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.384 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.385 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.385 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.385 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.385 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.385 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.386 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.386 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.386 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.386 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.386 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.386 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.387 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.387 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.387 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.387 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.387 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.387 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.388 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.388 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.388 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.388 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.388 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.388 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.389 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.389 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.389 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.389 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.389 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.390 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.390 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.390 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.390 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.390 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.390 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.390 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.391 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.391 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.391 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.391 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.391 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.392 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.392 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.392 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.392 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.392 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.393 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.393 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.393 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.393 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.393 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.393 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.394 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.394 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.394 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.394 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.394 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.394 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.395 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.395 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.395 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.395 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.395 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.395 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.396 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.396 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.396 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.396 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.396 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.397 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.397 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.397 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.397 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.397 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.398 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.398 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.398 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.398 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.398 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.398 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.399 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.399 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.399 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.399 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.399 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.400 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.400 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.400 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.400 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.400 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.401 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.401 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.401 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.401 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.401 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.401 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.401 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.402 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.402 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.402 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.402 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.402 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.402 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.403 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.403 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.403 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.403 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.403 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.404 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.404 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.404 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.404 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.404 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.404 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.405 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.405 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.405 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.405 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.405 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.405 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.406 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.406 225332 WARNING oslo_config.cfg [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 26 10:01:15 compute-1 nova_compute[225328]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 26 10:01:15 compute-1 nova_compute[225328]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 26 10:01:15 compute-1 nova_compute[225328]: and ``live_migration_inbound_addr`` respectively.
Jan 26 10:01:15 compute-1 nova_compute[225328]: ).  Its value may be silently ignored in the future.
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.406 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.406 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.406 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.407 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.407 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.407 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.407 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.407 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.407 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.408 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.408 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.408 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.408 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.408 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.409 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.409 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.409 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.409 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.409 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.rbd_secret_uuid        = 1a70b85d-e3fd-5814-8a6a-37ea00fcae30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.409 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.409 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.410 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.410 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.410 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.410 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.410 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.410 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.411 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.411 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.411 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.411 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.411 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.411 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.412 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.412 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.412 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.412 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.412 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.412 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.413 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.413 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.413 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.413 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.413 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.413 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.414 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.414 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.414 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.414 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.414 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.414 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.415 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.415 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.415 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.415 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.415 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.415 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.415 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.416 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.416 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.416 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.416 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.416 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.416 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.417 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.417 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.417 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.417 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.417 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.417 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.418 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.418 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.418 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.418 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.418 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.418 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.419 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.419 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.419 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.419 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.419 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.419 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.420 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.420 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.420 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.420 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.420 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.421 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.421 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.421 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.421 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.421 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.422 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.422 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.422 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.422 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.422 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.423 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.423 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.423 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.423 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.423 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.423 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.423 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.424 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.424 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.424 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.424 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.424 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.425 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.425 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.425 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.425 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.425 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.425 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.426 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.426 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.426 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.426 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.426 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.426 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.426 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.427 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.427 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.427 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.427 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.427 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.427 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.428 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.428 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.428 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.428 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.428 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.428 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.428 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.429 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.429 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.429 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.429 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.430 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.430 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.430 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.430 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.430 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.430 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.431 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.431 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.431 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.431 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.431 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.431 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.432 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.432 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.432 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.432 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.432 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.432 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.433 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.433 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.433 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.433 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.433 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.433 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.434 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.434 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.434 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.434 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.434 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.434 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.435 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.435 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.435 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.435 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.435 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.436 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.436 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.436 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.436 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.436 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.437 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.437 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.437 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.437 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.437 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.437 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.438 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.438 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.438 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.438 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.438 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.438 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.438 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.439 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.439 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.439 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.439 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.439 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.439 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.440 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.440 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.440 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.440 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.440 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.440 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.441 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.441 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.441 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.441 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.441 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.441 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.442 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.442 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.442 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.442 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.442 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.442 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.443 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.443 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.443 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.443 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.443 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.443 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.444 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.444 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.444 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.444 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.444 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.445 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.445 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.445 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.445 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.445 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.445 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.446 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.446 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.446 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.446 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.446 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.446 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.447 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.447 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.447 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.447 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.447 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.448 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.448 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.448 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.448 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.448 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.449 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.449 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.449 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.449 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.449 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.450 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.450 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.450 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.450 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.451 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.451 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.451 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.451 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.451 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.452 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.452 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.452 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.452 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.452 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.453 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.453 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.453 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.453 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.453 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.454 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.454 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.454 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.454 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.454 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.454 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.455 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.455 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.455 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.455 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.455 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.455 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.456 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.456 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.456 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.456 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.456 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.456 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.457 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.457 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.457 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.457 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.457 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.458 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.458 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.458 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.458 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.458 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.458 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.459 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.459 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.459 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.459 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.459 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.460 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.460 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.460 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.460 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.460 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.460 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.461 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.461 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.461 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.461 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.461 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.462 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.462 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.462 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.462 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.462 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.462 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.463 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.463 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.463 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.463 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.463 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.463 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.464 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.464 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.464 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.464 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.464 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.464 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.465 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.465 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.465 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.465 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.465 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.465 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.466 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.466 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.466 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.466 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.466 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.466 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.467 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.467 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.467 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.467 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.467 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.467 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.468 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.468 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.468 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.468 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.468 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.468 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.468 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.469 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.469 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.469 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.469 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.469 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.469 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.470 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.470 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.470 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.470 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.470 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.470 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.471 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.471 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.471 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.471 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.471 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.471 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.471 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.472 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.472 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.472 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.472 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.472 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.473 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.473 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.473 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.473 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.473 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.474 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.474 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.474 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.474 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.474 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.474 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.475 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.475 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.475 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.475 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.475 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.475 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.476 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.476 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.476 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.476 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.476 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.477 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.477 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.477 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.477 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.477 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.477 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.477 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.478 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.478 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.478 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.478 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.478 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.479 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.479 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.479 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.479 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.479 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.481 225332 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.507 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.508 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.508 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.508 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 26 10:01:15 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Jan 26 10:01:15 compute-1 systemd[1]: Started libvirt QEMU daemon.
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.594 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f948768a1f0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.598 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f948768a1f0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.599 225332 INFO nova.virt.libvirt.driver [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Connection event '1' reason 'None'
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.615 225332 WARNING nova.virt.libvirt.driver [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Jan 26 10:01:15 compute-1 nova_compute[225328]: 2026-01-26 10:01:15.617 225332 DEBUG nova.virt.libvirt.volume.mount [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 26 10:01:15 compute-1 python3.9[225836]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 10:01:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:01:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:16.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:16 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:16.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:16 compute-1 ceph-mon[80107]: pgmap v563: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 10:01:16 compute-1 nova_compute[225328]: 2026-01-26 10:01:16.541 225332 INFO nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Libvirt host capabilities <capabilities>
Jan 26 10:01:16 compute-1 nova_compute[225328]: 
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <host>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <uuid>0657a708-098a-4137-a4d8-8ea25323424c</uuid>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <cpu>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <arch>x86_64</arch>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model>EPYC-Rome-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <vendor>AMD</vendor>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <microcode version='16777317'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <signature family='23' model='49' stepping='0'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='x2apic'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='tsc-deadline'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='osxsave'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='hypervisor'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='tsc_adjust'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='spec-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='stibp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='arch-capabilities'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='ssbd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='cmp_legacy'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='topoext'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='virt-ssbd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='lbrv'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='tsc-scale'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='vmcb-clean'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='pause-filter'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='pfthreshold'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='svme-addr-chk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='rdctl-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='skip-l1dfl-vmentry'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='mds-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature name='pschange-mc-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <pages unit='KiB' size='4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <pages unit='KiB' size='2048'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <pages unit='KiB' size='1048576'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </cpu>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <power_management>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <suspend_mem/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </power_management>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <iommu support='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <migration_features>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <live/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <uri_transports>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <uri_transport>tcp</uri_transport>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <uri_transport>rdma</uri_transport>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </uri_transports>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </migration_features>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <topology>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <cells num='1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <cell id='0'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:           <memory unit='KiB'>7864304</memory>
Jan 26 10:01:16 compute-1 nova_compute[225328]:           <pages unit='KiB' size='4'>1966076</pages>
Jan 26 10:01:16 compute-1 nova_compute[225328]:           <pages unit='KiB' size='2048'>0</pages>
Jan 26 10:01:16 compute-1 nova_compute[225328]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 26 10:01:16 compute-1 nova_compute[225328]:           <distances>
Jan 26 10:01:16 compute-1 nova_compute[225328]:             <sibling id='0' value='10'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:           </distances>
Jan 26 10:01:16 compute-1 nova_compute[225328]:           <cpus num='8'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:           </cpus>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         </cell>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </cells>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </topology>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <cache>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </cache>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <secmodel>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model>selinux</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <doi>0</doi>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </secmodel>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <secmodel>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model>dac</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <doi>0</doi>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </secmodel>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </host>
Jan 26 10:01:16 compute-1 nova_compute[225328]: 
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <guest>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <os_type>hvm</os_type>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <arch name='i686'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <wordsize>32</wordsize>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <domain type='qemu'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <domain type='kvm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </arch>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <features>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <pae/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <nonpae/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <acpi default='on' toggle='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <apic default='on' toggle='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <cpuselection/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <deviceboot/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <disksnapshot default='on' toggle='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <externalSnapshot/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </features>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </guest>
Jan 26 10:01:16 compute-1 nova_compute[225328]: 
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <guest>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <os_type>hvm</os_type>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <arch name='x86_64'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <wordsize>64</wordsize>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <domain type='qemu'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <domain type='kvm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </arch>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <features>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <acpi default='on' toggle='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <apic default='on' toggle='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <cpuselection/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <deviceboot/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <disksnapshot default='on' toggle='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <externalSnapshot/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </features>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </guest>
Jan 26 10:01:16 compute-1 nova_compute[225328]: 
Jan 26 10:01:16 compute-1 nova_compute[225328]: </capabilities>
Jan 26 10:01:16 compute-1 nova_compute[225328]: 
Jan 26 10:01:16 compute-1 nova_compute[225328]: 2026-01-26 10:01:16.549 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 26 10:01:16 compute-1 nova_compute[225328]: 2026-01-26 10:01:16.574 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 26 10:01:16 compute-1 nova_compute[225328]: <domainCapabilities>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <domain>kvm</domain>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <arch>i686</arch>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <vcpu max='4096'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <iothreads supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <os supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <enum name='firmware'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <loader supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='type'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>rom</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>pflash</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='readonly'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>yes</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>no</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='secure'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>no</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </loader>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </os>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <cpu>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <mode name='host-passthrough' supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='hostPassthroughMigratable'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>on</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>off</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </mode>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <mode name='maximum' supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='maximumMigratable'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>on</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>off</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </mode>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <mode name='host-model' supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <vendor>AMD</vendor>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='x2apic'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='hypervisor'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='stibp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='ssbd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='overflow-recov'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='succor'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='lbrv'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='tsc-scale'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='flushbyasid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='pause-filter'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='pfthreshold'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='disable' name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </mode>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <mode name='custom' supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-noTSX'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='ClearwaterForest'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ddpd-u'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='intel-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='lam'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sha512'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sm3'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sm4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='ClearwaterForest-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ddpd-u'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='intel-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='lam'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sha512'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sm3'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sm4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cooperlake'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cooperlake-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cooperlake-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Denverton'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mpx'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Denverton-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mpx'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Denverton-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Denverton-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Dhyana-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Genoa'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='auto-ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='auto-ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='auto-ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='perfmon-v2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Milan'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Milan-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Milan-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Milan-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Rome'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Rome-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Rome-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Rome-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Turin'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='auto-ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vp2intersect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibpb-brtype'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='perfmon-v2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbpb'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='srso-user-kernel-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Turin-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='auto-ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vp2intersect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibpb-brtype'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='perfmon-v2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbpb'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='srso-user-kernel-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-v5'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='GraniteRapids'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='GraniteRapids-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='GraniteRapids-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-128'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-256'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-512'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='GraniteRapids-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-128'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-256'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-512'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-noTSX'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v5'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v6'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v7'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='IvyBridge'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='IvyBridge-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='IvyBridge-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='IvyBridge-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='KnightsMill'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-4fmaps'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-4vnniw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512er'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512pf'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='KnightsMill-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-4fmaps'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-4vnniw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512er'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512pf'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Opteron_G4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fma4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xop'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Opteron_G4-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fma4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xop'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Opteron_G5'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fma4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tbm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xop'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Opteron_G5-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fma4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tbm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xop'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SapphireRapids'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SapphireRapids-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SapphireRapids-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SapphireRapids-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SapphireRapids-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SierraForest'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SierraForest-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SierraForest-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='intel-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='lam'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SierraForest-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='intel-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='lam'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-v5'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Snowridge'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='core-capability'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mpx'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='split-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Snowridge-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='core-capability'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mpx'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='split-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Snowridge-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='core-capability'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='split-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Snowridge-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='core-capability'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='split-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Snowridge-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='athlon'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnow'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnowext'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='athlon-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnow'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnowext'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='core2duo'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='core2duo-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='coreduo'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='coreduo-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='n270'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='n270-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='phenom'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnow'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnowext'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='phenom-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnow'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnowext'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </mode>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </cpu>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <memoryBacking supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <enum name='sourceType'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <value>file</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <value>anonymous</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <value>memfd</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </memoryBacking>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <devices>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <disk supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='diskDevice'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>disk</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>cdrom</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>floppy</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>lun</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='bus'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>fdc</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>scsi</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>usb</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>sata</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='model'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio-transitional</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio-non-transitional</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </disk>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <graphics supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='type'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vnc</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>egl-headless</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>dbus</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </graphics>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <video supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='modelType'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vga</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>cirrus</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>none</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>bochs</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>ramfb</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </video>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <hostdev supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='mode'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>subsystem</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='startupPolicy'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>default</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>mandatory</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>requisite</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>optional</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='subsysType'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>usb</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>pci</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>scsi</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='capsType'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='pciBackend'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </hostdev>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <rng supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='model'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio-transitional</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio-non-transitional</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='backendModel'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>random</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>egd</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>builtin</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </rng>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <filesystem supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='driverType'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>path</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>handle</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtiofs</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </filesystem>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <tpm supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='model'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>tpm-tis</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>tpm-crb</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='backendModel'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>emulator</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>external</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='backendVersion'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>2.0</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </tpm>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <redirdev supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='bus'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>usb</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </redirdev>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <channel supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='type'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>pty</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>unix</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </channel>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <crypto supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='model'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='type'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>qemu</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='backendModel'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>builtin</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </crypto>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <interface supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='backendType'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>default</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>passt</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </interface>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <panic supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='model'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>isa</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>hyperv</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </panic>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <console supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='type'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>null</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vc</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>pty</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>dev</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>file</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>pipe</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>stdio</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>udp</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>tcp</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>unix</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>qemu-vdagent</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>dbus</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </console>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </devices>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <features>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <gic supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <vmcoreinfo supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <genid supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <backingStoreInput supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <backup supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <async-teardown supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <s390-pv supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <ps2 supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <tdx supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <sev supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <sgx supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <hyperv supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='features'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>relaxed</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vapic</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>spinlocks</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vpindex</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>runtime</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>synic</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>stimer</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>reset</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vendor_id</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>frequencies</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>reenlightenment</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>tlbflush</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>ipi</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>avic</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>emsr_bitmap</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>xmm_input</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <defaults>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <spinlocks>4095</spinlocks>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <stimer_direct>on</stimer_direct>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </defaults>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </hyperv>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <launchSecurity supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </features>
Jan 26 10:01:16 compute-1 nova_compute[225328]: </domainCapabilities>
Jan 26 10:01:16 compute-1 nova_compute[225328]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 10:01:16 compute-1 nova_compute[225328]: 2026-01-26 10:01:16.582 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 26 10:01:16 compute-1 nova_compute[225328]: <domainCapabilities>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <domain>kvm</domain>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <arch>i686</arch>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <vcpu max='240'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <iothreads supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <os supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <enum name='firmware'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <loader supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='type'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>rom</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>pflash</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='readonly'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>yes</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>no</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='secure'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>no</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </loader>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </os>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <cpu>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <mode name='host-passthrough' supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='hostPassthroughMigratable'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>on</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>off</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </mode>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <mode name='maximum' supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='maximumMigratable'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>on</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>off</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </mode>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <mode name='host-model' supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <vendor>AMD</vendor>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='x2apic'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='hypervisor'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='stibp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='ssbd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='overflow-recov'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='succor'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='lbrv'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='tsc-scale'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='flushbyasid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='pause-filter'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='pfthreshold'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='disable' name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </mode>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <mode name='custom' supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-noTSX'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='ClearwaterForest'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ddpd-u'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='intel-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='lam'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sha512'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sm3'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sm4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='ClearwaterForest-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ddpd-u'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='intel-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='lam'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sha512'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sm3'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sm4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cooperlake'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cooperlake-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 podman[225905]: 2026-01-26 10:01:16.656502787 +0000 UTC m=+0.101892322 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cooperlake-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Denverton'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mpx'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Denverton-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mpx'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Denverton-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Denverton-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Dhyana-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Genoa'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='auto-ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='auto-ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='auto-ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='perfmon-v2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Milan'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Milan-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Milan-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Milan-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Rome'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Rome-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Rome-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Rome-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Turin'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='auto-ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vp2intersect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibpb-brtype'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='perfmon-v2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbpb'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='srso-user-kernel-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Turin-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='auto-ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vp2intersect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibpb-brtype'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='perfmon-v2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbpb'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='srso-user-kernel-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-v5'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='GraniteRapids'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='GraniteRapids-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='GraniteRapids-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-128'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-256'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-512'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='GraniteRapids-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-128'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-256'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-512'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-noTSX'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v5'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v6'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v7'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='IvyBridge'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='IvyBridge-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='IvyBridge-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='IvyBridge-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='KnightsMill'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-4fmaps'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-4vnniw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512er'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512pf'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='KnightsMill-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-4fmaps'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-4vnniw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512er'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512pf'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Opteron_G4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fma4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xop'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Opteron_G4-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fma4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xop'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Opteron_G5'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fma4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tbm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xop'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Opteron_G5-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fma4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tbm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xop'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SapphireRapids'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SapphireRapids-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SapphireRapids-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SapphireRapids-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SapphireRapids-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SierraForest'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SierraForest-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SierraForest-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='intel-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='lam'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SierraForest-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='intel-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='lam'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-v5'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Snowridge'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='core-capability'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mpx'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='split-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Snowridge-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='core-capability'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mpx'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='split-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Snowridge-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='core-capability'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='split-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Snowridge-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='core-capability'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='split-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Snowridge-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='athlon'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnow'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnowext'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='athlon-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnow'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnowext'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='core2duo'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='core2duo-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='coreduo'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='coreduo-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='n270'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='n270-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='phenom'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnow'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnowext'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='phenom-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnow'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnowext'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </mode>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </cpu>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <memoryBacking supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <enum name='sourceType'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <value>file</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <value>anonymous</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <value>memfd</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </memoryBacking>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <devices>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <disk supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='diskDevice'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>disk</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>cdrom</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>floppy</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>lun</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='bus'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>ide</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>fdc</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>scsi</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>usb</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>sata</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='model'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio-transitional</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio-non-transitional</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </disk>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <graphics supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='type'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vnc</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>egl-headless</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>dbus</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </graphics>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <video supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='modelType'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vga</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>cirrus</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>none</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>bochs</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>ramfb</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </video>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <hostdev supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='mode'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>subsystem</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='startupPolicy'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>default</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>mandatory</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>requisite</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>optional</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='subsysType'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>usb</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>pci</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>scsi</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='capsType'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='pciBackend'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </hostdev>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <rng supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='model'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio-transitional</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio-non-transitional</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='backendModel'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>random</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>egd</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>builtin</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </rng>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <filesystem supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='driverType'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>path</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>handle</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtiofs</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </filesystem>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <tpm supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='model'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>tpm-tis</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>tpm-crb</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='backendModel'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>emulator</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>external</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='backendVersion'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>2.0</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </tpm>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <redirdev supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='bus'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>usb</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </redirdev>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <channel supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='type'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>pty</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>unix</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </channel>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <crypto supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='model'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='type'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>qemu</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='backendModel'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>builtin</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </crypto>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <interface supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='backendType'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>default</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>passt</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </interface>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <panic supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='model'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>isa</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>hyperv</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </panic>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <console supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='type'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>null</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vc</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>pty</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>dev</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>file</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>pipe</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>stdio</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>udp</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>tcp</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>unix</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>qemu-vdagent</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>dbus</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </console>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </devices>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <features>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <gic supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <vmcoreinfo supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <genid supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <backingStoreInput supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <backup supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <async-teardown supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <s390-pv supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <ps2 supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <tdx supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <sev supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <sgx supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <hyperv supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='features'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>relaxed</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vapic</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>spinlocks</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vpindex</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>runtime</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>synic</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>stimer</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>reset</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vendor_id</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>frequencies</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>reenlightenment</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>tlbflush</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>ipi</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>avic</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>emsr_bitmap</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>xmm_input</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <defaults>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <spinlocks>4095</spinlocks>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <stimer_direct>on</stimer_direct>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </defaults>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </hyperv>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <launchSecurity supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </features>
Jan 26 10:01:16 compute-1 nova_compute[225328]: </domainCapabilities>
Jan 26 10:01:16 compute-1 nova_compute[225328]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 10:01:16 compute-1 nova_compute[225328]: 2026-01-26 10:01:16.638 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 26 10:01:16 compute-1 nova_compute[225328]: 2026-01-26 10:01:16.642 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 26 10:01:16 compute-1 nova_compute[225328]: <domainCapabilities>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <domain>kvm</domain>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <arch>x86_64</arch>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <vcpu max='4096'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <iothreads supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <os supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <enum name='firmware'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <value>efi</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <loader supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='type'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>rom</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>pflash</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='readonly'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>yes</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>no</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='secure'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>yes</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>no</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </loader>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </os>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <cpu>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <mode name='host-passthrough' supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='hostPassthroughMigratable'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>on</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>off</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </mode>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <mode name='maximum' supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='maximumMigratable'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>on</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>off</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </mode>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <mode name='host-model' supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <vendor>AMD</vendor>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='x2apic'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='hypervisor'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='stibp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='ssbd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='overflow-recov'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='succor'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='lbrv'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='tsc-scale'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='flushbyasid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='pause-filter'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='pfthreshold'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='disable' name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </mode>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <mode name='custom' supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-noTSX'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='ClearwaterForest'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ddpd-u'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='intel-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='lam'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sha512'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sm3'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sm4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='ClearwaterForest-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ddpd-u'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='intel-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='lam'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sha512'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sm3'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sm4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cooperlake'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cooperlake-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cooperlake-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Denverton'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mpx'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Denverton-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mpx'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Denverton-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Denverton-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Dhyana-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Genoa'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='auto-ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='auto-ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='auto-ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='perfmon-v2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Milan'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Milan-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Milan-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Milan-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Rome'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Rome-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Rome-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Rome-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Turin'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='auto-ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vp2intersect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibpb-brtype'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='perfmon-v2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbpb'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='srso-user-kernel-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Turin-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='auto-ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vp2intersect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibpb-brtype'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='perfmon-v2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbpb'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='srso-user-kernel-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-v5'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='GraniteRapids'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='GraniteRapids-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='GraniteRapids-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-128'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-256'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-512'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='GraniteRapids-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-128'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-256'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-512'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-noTSX'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v5'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v6'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v7'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='IvyBridge'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='IvyBridge-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='IvyBridge-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='IvyBridge-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='KnightsMill'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-4fmaps'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-4vnniw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512er'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512pf'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='KnightsMill-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-4fmaps'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-4vnniw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512er'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512pf'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Opteron_G4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fma4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xop'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Opteron_G4-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fma4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xop'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Opteron_G5'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fma4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tbm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xop'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Opteron_G5-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fma4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tbm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xop'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SapphireRapids'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SapphireRapids-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SapphireRapids-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SapphireRapids-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SapphireRapids-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SierraForest'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SierraForest-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SierraForest-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='intel-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='lam'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SierraForest-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='intel-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='lam'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-v5'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Snowridge'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='core-capability'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mpx'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='split-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Snowridge-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='core-capability'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mpx'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='split-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Snowridge-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='core-capability'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='split-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Snowridge-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='core-capability'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='split-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Snowridge-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='athlon'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnow'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnowext'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='athlon-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnow'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnowext'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='core2duo'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='core2duo-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='coreduo'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='coreduo-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='n270'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='n270-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='phenom'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnow'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnowext'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='phenom-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnow'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnowext'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </mode>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </cpu>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <memoryBacking supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <enum name='sourceType'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <value>file</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <value>anonymous</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <value>memfd</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </memoryBacking>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <devices>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <disk supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='diskDevice'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>disk</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>cdrom</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>floppy</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>lun</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='bus'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>fdc</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>scsi</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>usb</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>sata</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='model'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio-transitional</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio-non-transitional</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </disk>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <graphics supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='type'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vnc</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>egl-headless</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>dbus</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </graphics>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <video supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='modelType'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vga</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>cirrus</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>none</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>bochs</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>ramfb</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </video>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <hostdev supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='mode'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>subsystem</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='startupPolicy'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>default</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>mandatory</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>requisite</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>optional</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='subsysType'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>usb</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>pci</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>scsi</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='capsType'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='pciBackend'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </hostdev>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <rng supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='model'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio-transitional</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio-non-transitional</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='backendModel'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>random</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>egd</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>builtin</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </rng>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <filesystem supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='driverType'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>path</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>handle</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtiofs</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </filesystem>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <tpm supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='model'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>tpm-tis</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>tpm-crb</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='backendModel'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>emulator</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>external</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='backendVersion'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>2.0</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </tpm>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <redirdev supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='bus'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>usb</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </redirdev>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <channel supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='type'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>pty</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>unix</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </channel>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <crypto supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='model'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='type'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>qemu</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='backendModel'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>builtin</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </crypto>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <interface supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='backendType'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>default</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>passt</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </interface>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <panic supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='model'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>isa</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>hyperv</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </panic>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <console supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='type'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>null</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vc</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>pty</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>dev</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>file</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>pipe</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>stdio</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>udp</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>tcp</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>unix</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>qemu-vdagent</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>dbus</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </console>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </devices>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <features>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <gic supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <vmcoreinfo supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <genid supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <backingStoreInput supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <backup supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <async-teardown supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <s390-pv supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <ps2 supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <tdx supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <sev supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <sgx supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <hyperv supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='features'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>relaxed</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vapic</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>spinlocks</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vpindex</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>runtime</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>synic</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>stimer</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>reset</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vendor_id</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>frequencies</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>reenlightenment</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>tlbflush</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>ipi</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>avic</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>emsr_bitmap</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>xmm_input</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <defaults>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <spinlocks>4095</spinlocks>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <stimer_direct>on</stimer_direct>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </defaults>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </hyperv>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <launchSecurity supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </features>
Jan 26 10:01:16 compute-1 nova_compute[225328]: </domainCapabilities>
Jan 26 10:01:16 compute-1 nova_compute[225328]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 10:01:16 compute-1 nova_compute[225328]: 2026-01-26 10:01:16.740 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 26 10:01:16 compute-1 nova_compute[225328]: <domainCapabilities>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <domain>kvm</domain>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <arch>x86_64</arch>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <vcpu max='240'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <iothreads supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <os supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <enum name='firmware'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <loader supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='type'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>rom</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>pflash</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='readonly'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>yes</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>no</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='secure'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>no</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </loader>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </os>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <cpu>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <mode name='host-passthrough' supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='hostPassthroughMigratable'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>on</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>off</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </mode>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <mode name='maximum' supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='maximumMigratable'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>on</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>off</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </mode>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <mode name='host-model' supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <vendor>AMD</vendor>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='x2apic'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='hypervisor'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='stibp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='ssbd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='overflow-recov'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='succor'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='lbrv'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='tsc-scale'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='flushbyasid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='pause-filter'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='pfthreshold'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <feature policy='disable' name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </mode>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <mode name='custom' supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-noTSX'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Broadwell-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='ClearwaterForest'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ddpd-u'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='intel-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='lam'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sha512'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sm3'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sm4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='ClearwaterForest-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ddpd-u'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='intel-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='lam'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sha512'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sm3'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sm4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cooperlake'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cooperlake-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Cooperlake-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Denverton'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mpx'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Denverton-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mpx'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Denverton-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Denverton-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Dhyana-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Genoa'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='auto-ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='auto-ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='auto-ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='perfmon-v2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Milan'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Milan-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Milan-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Milan-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Rome'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Rome-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Rome-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Rome-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Turin'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='auto-ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vp2intersect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibpb-brtype'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='perfmon-v2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbpb'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='srso-user-kernel-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-Turin-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amd-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='auto-ibrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vp2intersect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibpb-brtype'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='perfmon-v2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbpb'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='srso-user-kernel-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='stibp-always-on'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='EPYC-v5'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='GraniteRapids'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='GraniteRapids-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='GraniteRapids-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-128'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-256'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-512'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='GraniteRapids-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-128'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-256'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx10-512'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='prefetchiti'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-noTSX'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Haswell-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v5'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v6'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Icelake-Server-v7'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='IvyBridge'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='IvyBridge-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='IvyBridge-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='IvyBridge-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='KnightsMill'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-4fmaps'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-4vnniw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512er'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512pf'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='KnightsMill-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-4fmaps'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-4vnniw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512er'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512pf'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Opteron_G4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fma4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xop'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Opteron_G4-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fma4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xop'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Opteron_G5'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fma4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tbm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xop'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Opteron_G5-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fma4'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tbm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xop'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SapphireRapids'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SapphireRapids-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SapphireRapids-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SapphireRapids-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SapphireRapids-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='amx-tile'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-bf16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-fp16'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bitalg'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrc'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fzrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='la57'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='taa-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SierraForest'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SierraForest-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SierraForest-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='intel-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='lam'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='SierraForest-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ifma'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cmpccxadd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fbsdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='fsrs'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ibrs-all'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='intel-psfd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='lam'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mcdt-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pbrsb-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='psdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='serialize'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vaes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Client-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='hle'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='rtm'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Skylake-Server-v5'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512bw'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512cd'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512dq'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512f'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='avx512vl'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='invpcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pcid'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='pku'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Snowridge'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='core-capability'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mpx'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='split-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Snowridge-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='core-capability'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='mpx'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='split-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Snowridge-v2'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='core-capability'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='split-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Snowridge-v3'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='core-capability'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='split-lock-detect'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='Snowridge-v4'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='cldemote'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='erms'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='gfni'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdir64b'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='movdiri'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='xsaves'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='athlon'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnow'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnowext'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='athlon-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnow'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnowext'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='core2duo'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='core2duo-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='coreduo'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='coreduo-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='n270'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='n270-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='ss'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='phenom'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnow'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnowext'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <blockers model='phenom-v1'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnow'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <feature name='3dnowext'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </blockers>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </mode>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </cpu>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <memoryBacking supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <enum name='sourceType'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <value>file</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <value>anonymous</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <value>memfd</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </memoryBacking>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <devices>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <disk supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='diskDevice'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>disk</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>cdrom</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>floppy</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>lun</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='bus'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>ide</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>fdc</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>scsi</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>usb</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>sata</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='model'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio-transitional</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio-non-transitional</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </disk>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <graphics supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='type'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vnc</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>egl-headless</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>dbus</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </graphics>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <video supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='modelType'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vga</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>cirrus</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>none</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>bochs</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>ramfb</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </video>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <hostdev supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='mode'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>subsystem</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='startupPolicy'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>default</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>mandatory</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>requisite</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>optional</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='subsysType'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>usb</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>pci</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>scsi</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='capsType'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='pciBackend'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </hostdev>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <rng supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='model'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio-transitional</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtio-non-transitional</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='backendModel'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>random</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>egd</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>builtin</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </rng>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <filesystem supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='driverType'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>path</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>handle</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>virtiofs</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </filesystem>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <tpm supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='model'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>tpm-tis</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>tpm-crb</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='backendModel'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>emulator</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>external</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='backendVersion'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>2.0</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </tpm>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <redirdev supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='bus'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>usb</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </redirdev>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <channel supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='type'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>pty</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>unix</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </channel>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <crypto supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='model'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='type'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>qemu</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='backendModel'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>builtin</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </crypto>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <interface supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='backendType'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>default</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>passt</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </interface>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <panic supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='model'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>isa</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>hyperv</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </panic>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <console supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='type'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>null</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vc</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>pty</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>dev</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>file</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>pipe</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>stdio</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>udp</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>tcp</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>unix</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>qemu-vdagent</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>dbus</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </console>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </devices>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   <features>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <gic supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <vmcoreinfo supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <genid supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <backingStoreInput supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <backup supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <async-teardown supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <s390-pv supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <ps2 supported='yes'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <tdx supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <sev supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <sgx supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <hyperv supported='yes'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <enum name='features'>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>relaxed</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vapic</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>spinlocks</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vpindex</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>runtime</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>synic</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>stimer</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>reset</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>vendor_id</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>frequencies</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>reenlightenment</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>tlbflush</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>ipi</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>avic</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>emsr_bitmap</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <value>xmm_input</value>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </enum>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       <defaults>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <spinlocks>4095</spinlocks>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <stimer_direct>on</stimer_direct>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 10:01:16 compute-1 nova_compute[225328]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 10:01:16 compute-1 nova_compute[225328]:       </defaults>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     </hyperv>
Jan 26 10:01:16 compute-1 nova_compute[225328]:     <launchSecurity supported='no'/>
Jan 26 10:01:16 compute-1 nova_compute[225328]:   </features>
Jan 26 10:01:16 compute-1 nova_compute[225328]: </domainCapabilities>
Jan 26 10:01:16 compute-1 nova_compute[225328]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 10:01:16 compute-1 nova_compute[225328]: 2026-01-26 10:01:16.813 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 26 10:01:16 compute-1 nova_compute[225328]: 2026-01-26 10:01:16.813 225332 INFO nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Secure Boot support detected
Jan 26 10:01:16 compute-1 nova_compute[225328]: 2026-01-26 10:01:16.815 225332 INFO nova.virt.libvirt.driver [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 26 10:01:16 compute-1 nova_compute[225328]: 2026-01-26 10:01:16.815 225332 INFO nova.virt.libvirt.driver [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 26 10:01:16 compute-1 nova_compute[225328]: 2026-01-26 10:01:16.824 225332 DEBUG nova.virt.libvirt.driver [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 26 10:01:16 compute-1 nova_compute[225328]: 2026-01-26 10:01:16.866 225332 INFO nova.virt.node [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Determined node identity d06842a0-5d13-4573-bb78-d433bbb380e4 from /var/lib/nova/compute_id
Jan 26 10:01:16 compute-1 nova_compute[225328]: 2026-01-26 10:01:16.975 225332 WARNING nova.compute.manager [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Compute nodes ['d06842a0-5d13-4573-bb78-d433bbb380e4'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 26 10:01:17 compute-1 nova_compute[225328]: 2026-01-26 10:01:17.007 225332 INFO nova.compute.manager [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 26 10:01:17 compute-1 nova_compute[225328]: 2026-01-26 10:01:17.046 225332 WARNING nova.compute.manager [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Jan 26 10:01:17 compute-1 nova_compute[225328]: 2026-01-26 10:01:17.046 225332 DEBUG oslo_concurrency.lockutils [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:01:17 compute-1 nova_compute[225328]: 2026-01-26 10:01:17.047 225332 DEBUG oslo_concurrency.lockutils [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:01:17 compute-1 nova_compute[225328]: 2026-01-26 10:01:17.047 225332 DEBUG oslo_concurrency.lockutils [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:01:17 compute-1 nova_compute[225328]: 2026-01-26 10:01:17.047 225332 DEBUG nova.compute.resource_tracker [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:01:17 compute-1 nova_compute[225328]: 2026-01-26 10:01:17.048 225332 DEBUG oslo_concurrency.processutils [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:01:17 compute-1 sshd-session[225881]: Invalid user ubuntu from 104.248.198.71 port 33138
Jan 26 10:01:17 compute-1 sudo[226037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awfuhsitywiqovmkxtmgtybbggcwmcyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421676.5532758-3567-195262873773911/AnsiballZ_podman_container.py'
Jan 26 10:01:17 compute-1 sudo[226037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:01:17 compute-1 sshd-session[225881]: Connection closed by invalid user ubuntu 104.248.198.71 port 33138 [preauth]
Jan 26 10:01:17 compute-1 python3.9[226039]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 26 10:01:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100117 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 10:01:17 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 10:01:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:01:17 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1791287281' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:01:17 compute-1 nova_compute[225328]: 2026-01-26 10:01:17.539 225332 DEBUG oslo_concurrency.processutils [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:01:17 compute-1 sudo[226037]: pam_unix(sudo:session): session closed for user root
Jan 26 10:01:17 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Jan 26 10:01:17 compute-1 systemd[1]: Started libvirt nodedev daemon.
Jan 26 10:01:17 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/369882354' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:01:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:01:17 compute-1 nova_compute[225328]: 2026-01-26 10:01:17.846 225332 WARNING nova.virt.libvirt.driver [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:01:17 compute-1 nova_compute[225328]: 2026-01-26 10:01:17.847 225332 DEBUG nova.compute.resource_tracker [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5295MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:01:17 compute-1 nova_compute[225328]: 2026-01-26 10:01:17.847 225332 DEBUG oslo_concurrency.lockutils [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:01:17 compute-1 nova_compute[225328]: 2026-01-26 10:01:17.848 225332 DEBUG oslo_concurrency.lockutils [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:01:17 compute-1 nova_compute[225328]: 2026-01-26 10:01:17.862 225332 WARNING nova.compute.resource_tracker [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] No compute node record for compute-1.ctlplane.example.com:d06842a0-5d13-4573-bb78-d433bbb380e4: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host d06842a0-5d13-4573-bb78-d433bbb380e4 could not be found.
Jan 26 10:01:17 compute-1 nova_compute[225328]: 2026-01-26 10:01:17.896 225332 INFO nova.compute.resource_tracker [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: d06842a0-5d13-4573-bb78-d433bbb380e4
Jan 26 10:01:17 compute-1 nova_compute[225328]: 2026-01-26 10:01:17.965 225332 DEBUG nova.compute.resource_tracker [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:01:17 compute-1 nova_compute[225328]: 2026-01-26 10:01:17.965 225332 DEBUG nova.compute.resource_tracker [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:01:18 compute-1 sudo[226258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhbjlwhfabrodgarshgevtxvehzxewat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421677.8087466-3591-175025847081067/AnsiballZ_systemd.py'
Jan 26 10:01:18 compute-1 sudo[226258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:01:18 compute-1 python3.9[226260]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 10:01:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:18.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:18.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:18 compute-1 systemd[1]: Stopping nova_compute container...
Jan 26 10:01:18 compute-1 nova_compute[225328]: 2026-01-26 10:01:18.458 225332 DEBUG oslo_concurrency.lockutils [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.610s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:01:18 compute-1 nova_compute[225328]: 2026-01-26 10:01:18.459 225332 DEBUG oslo_concurrency.lockutils [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:01:18 compute-1 nova_compute[225328]: 2026-01-26 10:01:18.459 225332 DEBUG oslo_concurrency.lockutils [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:01:18 compute-1 nova_compute[225328]: 2026-01-26 10:01:18.459 225332 DEBUG oslo_concurrency.lockutils [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:01:18 compute-1 ceph-mon[80107]: pgmap v564: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:01:18 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1791287281' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:01:18 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/686553642' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:01:18 compute-1 virtqemud[225791]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 26 10:01:18 compute-1 virtqemud[225791]: hostname: compute-1
Jan 26 10:01:18 compute-1 virtqemud[225791]: End of file while reading data: Input/output error
Jan 26 10:01:18 compute-1 systemd[1]: libpod-e5e330867e7e5da6ef9c6b9af3e2d2c9ff3fe5beb198d40aa1d27e78e35090e6.scope: Deactivated successfully.
Jan 26 10:01:18 compute-1 systemd[1]: libpod-e5e330867e7e5da6ef9c6b9af3e2d2c9ff3fe5beb198d40aa1d27e78e35090e6.scope: Consumed 3.675s CPU time.
Jan 26 10:01:18 compute-1 podman[226264]: 2026-01-26 10:01:18.87747575 +0000 UTC m=+0.481365578 container died e5e330867e7e5da6ef9c6b9af3e2d2c9ff3fe5beb198d40aa1d27e78e35090e6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 10:01:19 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e5e330867e7e5da6ef9c6b9af3e2d2c9ff3fe5beb198d40aa1d27e78e35090e6-userdata-shm.mount: Deactivated successfully.
Jan 26 10:01:19 compute-1 systemd[1]: var-lib-containers-storage-overlay-bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a-merged.mount: Deactivated successfully.
Jan 26 10:01:19 compute-1 podman[226264]: 2026-01-26 10:01:19.271857593 +0000 UTC m=+0.875747391 container cleanup e5e330867e7e5da6ef9c6b9af3e2d2c9ff3fe5beb198d40aa1d27e78e35090e6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 10:01:19 compute-1 podman[226264]: nova_compute
Jan 26 10:01:19 compute-1 podman[226294]: nova_compute
Jan 26 10:01:19 compute-1 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 26 10:01:19 compute-1 systemd[1]: Stopped nova_compute container.
Jan 26 10:01:19 compute-1 systemd[1]: Starting nova_compute container...
Jan 26 10:01:19 compute-1 systemd[1]: Started libcrun container.
Jan 26 10:01:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 26 10:01:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 26 10:01:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 26 10:01:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 26 10:01:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 26 10:01:19 compute-1 podman[226307]: 2026-01-26 10:01:19.464277644 +0000 UTC m=+0.090812809 container init e5e330867e7e5da6ef9c6b9af3e2d2c9ff3fe5beb198d40aa1d27e78e35090e6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 10:01:19 compute-1 podman[226307]: 2026-01-26 10:01:19.472099978 +0000 UTC m=+0.098635123 container start e5e330867e7e5da6ef9c6b9af3e2d2c9ff3fe5beb198d40aa1d27e78e35090e6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 10:01:19 compute-1 nova_compute[226322]: + sudo -E kolla_set_configs
Jan 26 10:01:19 compute-1 podman[226307]: nova_compute
Jan 26 10:01:19 compute-1 systemd[1]: Started nova_compute container.
Jan 26 10:01:19 compute-1 sudo[226258]: pam_unix(sudo:session): session closed for user root
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Validating config file
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Copying service configuration files
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Deleting /etc/ceph
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Creating directory /etc/ceph
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Setting permission for /etc/ceph
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Writing out command to execute
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 10:01:19 compute-1 nova_compute[226322]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 10:01:19 compute-1 nova_compute[226322]: ++ cat /run_command
Jan 26 10:01:19 compute-1 nova_compute[226322]: + CMD=nova-compute
Jan 26 10:01:19 compute-1 nova_compute[226322]: + ARGS=
Jan 26 10:01:19 compute-1 nova_compute[226322]: + sudo kolla_copy_cacerts
Jan 26 10:01:19 compute-1 nova_compute[226322]: + [[ ! -n '' ]]
Jan 26 10:01:19 compute-1 nova_compute[226322]: + . kolla_extend_start
Jan 26 10:01:19 compute-1 nova_compute[226322]: Running command: 'nova-compute'
Jan 26 10:01:19 compute-1 nova_compute[226322]: + echo 'Running command: '\''nova-compute'\'''
Jan 26 10:01:19 compute-1 nova_compute[226322]: + umask 0022
Jan 26 10:01:19 compute-1 nova_compute[226322]: + exec nova-compute
Jan 26 10:01:20 compute-1 sshd-session[226278]: Connection closed by authenticating user root 178.62.249.31 port 45802 [preauth]
Jan 26 10:01:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:20.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:20.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:20 compute-1 sudo[226360]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:01:20 compute-1 sudo[226360]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:01:20 compute-1 sudo[226360]: pam_unix(sudo:session): session closed for user root
Jan 26 10:01:21 compute-1 nova_compute[226322]: 2026-01-26 10:01:21.515 226326 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 26 10:01:21 compute-1 nova_compute[226322]: 2026-01-26 10:01:21.515 226326 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 26 10:01:21 compute-1 nova_compute[226322]: 2026-01-26 10:01:21.515 226326 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 26 10:01:21 compute-1 nova_compute[226322]: 2026-01-26 10:01:21.516 226326 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 26 10:01:21 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:01:21 compute-1 ceph-mon[80107]: pgmap v565: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:01:21 compute-1 nova_compute[226322]: 2026-01-26 10:01:21.650 226326 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:01:21 compute-1 nova_compute[226322]: 2026-01-26 10:01:21.676 226326 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:01:21 compute-1 nova_compute[226322]: 2026-01-26 10:01:21.676 226326 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.192 226326 INFO nova.virt.driver [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 26 10:01:22 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 8.
Jan 26 10:01:22 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 10:01:22 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.924s CPU time.
Jan 26 10:01:22 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.323 226326 INFO nova.compute.provider_config [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.332 226326 DEBUG oslo_concurrency.lockutils [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.332 226326 DEBUG oslo_concurrency.lockutils [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.332 226326 DEBUG oslo_concurrency.lockutils [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.333 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.333 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.333 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.333 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.333 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.333 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.333 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.334 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.334 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.334 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.334 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.334 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.334 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.335 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.335 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.335 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.335 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.335 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.335 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.335 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.336 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.336 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.336 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.336 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.336 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.336 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.337 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.337 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.337 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.337 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.337 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.338 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.338 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.338 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.338 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.338 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.338 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.339 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.339 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.339 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.339 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.339 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.339 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.340 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.340 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.340 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.340 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.340 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.340 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.341 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.341 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.341 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.341 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.341 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.341 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.342 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.342 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.342 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.342 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.342 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.342 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.342 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.343 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.343 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.343 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.343 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.343 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.343 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.344 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.344 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.344 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.344 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.344 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.344 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.345 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.345 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.345 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.345 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.345 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.345 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.346 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.346 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.346 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.346 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.346 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.346 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.347 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.347 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.347 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.347 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.347 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.348 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.348 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.348 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.348 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.348 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.349 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.349 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.349 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.349 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.349 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.350 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.350 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.350 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.350 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.350 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.350 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.350 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.351 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.351 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.351 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.351 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.351 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.351 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.352 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.352 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.352 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.352 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.352 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.353 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.353 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.353 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.353 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.353 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.353 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.354 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.354 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.354 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.354 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.354 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.355 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.355 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.355 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.355 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.355 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.355 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.356 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.356 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.356 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.356 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.356 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.357 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.357 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.357 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.357 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.357 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.357 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.357 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.358 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.358 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.358 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.358 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.358 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.358 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.359 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.359 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.359 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:22.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.359 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.359 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.359 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.360 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.360 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:22.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.360 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.360 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.360 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.360 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.360 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.361 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.361 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.361 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.361 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.361 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.361 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.361 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.362 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.362 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.362 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.362 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.362 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.362 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.363 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.363 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.363 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.363 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.363 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.363 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.363 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.364 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.364 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.364 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.364 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.364 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.364 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.365 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.365 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.365 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.365 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.365 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.365 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.366 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.366 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.366 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.366 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.366 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.366 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.367 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.367 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.367 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.367 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.367 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.368 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.368 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.368 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.368 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.368 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.368 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.369 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.369 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.369 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.369 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.369 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.370 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.370 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.370 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.370 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.370 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.371 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.371 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.371 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.371 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.371 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.371 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.372 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.372 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.372 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.372 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.372 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.373 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.373 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.373 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.373 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.373 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.373 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.374 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.374 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.374 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.374 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.374 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.375 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.375 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.375 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.375 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.375 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.375 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.376 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.376 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.376 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.376 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.376 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.377 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.377 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.377 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.377 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.377 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.378 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.378 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.378 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.378 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.378 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.379 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.379 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.379 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.379 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.379 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.379 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.380 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.380 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.380 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.380 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.380 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.381 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.381 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.381 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.381 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.381 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.381 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.382 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.382 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.382 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.382 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.382 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.383 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.383 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.383 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.383 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.383 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.383 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.384 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.384 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.384 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.384 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.384 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.385 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.385 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.385 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.385 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.385 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.385 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.386 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.386 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.386 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.386 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.386 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.387 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.387 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.387 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.387 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.387 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.387 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.388 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.388 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.388 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.388 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.388 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.389 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.389 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.389 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.389 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.389 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.390 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.390 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.390 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.390 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.390 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.390 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.391 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.391 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.391 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.391 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.391 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.392 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.392 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.392 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.392 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.392 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.393 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.393 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.393 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.393 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.393 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.394 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.394 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.394 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.394 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.394 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.395 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.395 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.395 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.395 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.395 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.396 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.396 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.396 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.396 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.396 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.396 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.397 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.397 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.397 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.397 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.397 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.398 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.398 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.398 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.398 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.398 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.398 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.399 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.399 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.399 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.399 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.399 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.399 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.400 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.400 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.400 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.400 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.400 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.401 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.401 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.401 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.401 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.402 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.402 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.402 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.402 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.402 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.402 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.403 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.403 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.403 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.403 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.403 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.403 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.403 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.404 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.404 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.404 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.404 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.404 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.404 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.404 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.405 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.405 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.405 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.405 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.405 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.405 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.405 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.406 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.406 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.406 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.406 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.406 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.406 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.407 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.407 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.407 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.407 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.407 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.407 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.407 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.408 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.408 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.408 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.408 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.408 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.408 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.408 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.409 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.409 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.409 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.409 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.409 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.409 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.410 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.410 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.410 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.410 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.410 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.410 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.410 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.411 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.411 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.411 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.411 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.411 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.411 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.412 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.412 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.412 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.412 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.412 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.412 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.413 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.413 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.413 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.413 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.413 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.413 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.413 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.413 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.414 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.414 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.414 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.414 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.414 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.414 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.414 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.415 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.415 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.415 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.415 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.415 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.415 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.415 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.416 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.416 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.416 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.416 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.416 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.416 226326 WARNING oslo_config.cfg [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 26 10:01:22 compute-1 nova_compute[226322]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 26 10:01:22 compute-1 nova_compute[226322]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 26 10:01:22 compute-1 nova_compute[226322]: and ``live_migration_inbound_addr`` respectively.
Jan 26 10:01:22 compute-1 nova_compute[226322]: ).  Its value may be silently ignored in the future.
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.417 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.417 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.417 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.417 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.417 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.417 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.418 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.418 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.418 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.418 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.418 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.419 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.419 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.419 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.419 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.419 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.419 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.420 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.420 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.rbd_secret_uuid        = 1a70b85d-e3fd-5814-8a6a-37ea00fcae30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.420 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.420 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.420 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.420 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.420 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.421 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.421 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.421 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.421 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.421 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.422 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.422 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.422 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.422 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.422 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.423 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.423 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.423 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.423 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.423 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.423 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.424 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.424 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.424 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.424 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.424 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.424 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.425 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.425 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.425 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.425 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.425 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.425 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.426 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.426 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.426 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.426 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.426 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.426 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.426 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.427 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.427 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.427 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.427 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.427 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.427 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.427 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.428 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.428 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.428 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.428 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.428 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.428 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.428 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.429 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.429 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.429 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.429 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.429 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.429 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.430 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.430 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.430 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.430 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.430 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.430 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.431 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.431 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.431 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.431 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.431 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.431 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.431 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.432 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.432 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.432 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.432 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.432 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.432 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.432 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.433 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.433 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.433 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.433 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.433 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.433 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.433 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.434 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.434 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.434 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.434 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.434 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.434 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.434 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.435 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.435 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.435 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.435 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.435 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.435 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.435 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.436 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.436 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.436 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.436 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.436 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.436 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.437 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.437 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.437 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.437 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.437 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.437 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.438 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.438 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.438 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.438 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.438 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.439 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.439 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.439 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.439 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.439 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.440 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.440 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.440 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.440 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.440 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.440 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.440 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.441 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.441 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.441 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.441 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.441 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.441 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.441 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.442 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.442 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.442 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.442 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.442 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.442 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.443 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.443 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.444 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.444 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.444 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.444 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.445 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.445 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.445 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.445 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.445 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.446 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.446 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.446 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.446 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.446 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.446 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.447 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.447 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.447 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.447 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.447 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.448 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.448 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.448 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.448 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.448 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.448 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.448 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.449 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.449 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.449 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.449 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.449 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.449 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.450 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.450 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.450 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.450 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.450 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.450 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.451 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.451 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.451 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.451 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.452 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.452 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.452 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.452 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.452 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.452 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.453 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.453 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.453 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.453 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.453 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.453 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.453 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.454 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.454 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.454 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.454 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.454 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.454 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.455 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.455 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.455 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.455 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.455 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.455 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.456 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.456 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.456 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.456 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.456 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.457 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.457 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.457 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.457 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.457 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.458 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.458 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.458 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.458 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.458 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.459 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.459 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.459 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.459 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.460 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.460 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.460 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.460 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.460 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.460 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.461 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.461 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.461 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.461 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.461 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.462 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.462 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.462 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.462 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.462 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.463 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.463 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.463 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.463 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.463 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.464 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.464 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.464 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.464 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.464 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.464 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.465 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.465 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.465 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.465 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.465 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.466 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.466 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.466 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.466 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.466 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.466 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.466 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.467 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.467 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.467 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.467 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.467 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.468 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.468 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.468 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.468 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.468 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.468 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.468 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.469 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.469 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.469 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.469 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.469 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.469 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.470 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.470 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.470 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.470 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.470 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.470 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.471 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.471 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.471 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.471 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.471 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.472 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.472 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.472 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.472 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.472 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.472 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.473 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.473 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.473 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.473 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.473 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.474 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.474 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.474 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.474 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.474 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.474 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.474 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.475 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.475 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.475 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.475 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.475 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.475 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.475 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.476 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.476 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.476 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.476 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.476 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.476 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.476 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.477 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.477 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.477 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.477 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.477 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.478 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.478 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.478 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.478 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.478 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.479 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.479 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.479 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.479 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.479 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.479 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.480 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.480 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.480 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.480 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.480 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.481 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.481 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.481 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.481 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.481 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.482 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.482 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.482 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.482 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.482 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.482 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.483 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.483 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.483 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.483 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.483 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.483 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.483 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.484 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.484 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.484 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.484 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.484 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.484 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.484 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.485 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.485 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.485 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.485 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.485 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.485 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.486 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.486 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.486 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.486 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.486 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.486 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.487 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.487 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.487 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.487 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.487 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.487 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.488 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.488 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.488 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.488 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.488 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.488 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.488 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.489 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.489 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.490 226326 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 26 10:01:22 compute-1 podman[226436]: 2026-01-26 10:01:22.498921 +0000 UTC m=+0.066627552 container create 3a6ae4dc2df3af4a6bf4e06b720b3b075b63d61c4adfe2b2c9d563de7c3cae7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True)
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.513 226326 INFO nova.virt.node [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Determined node identity d06842a0-5d13-4573-bb78-d433bbb380e4 from /var/lib/nova/compute_id
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.514 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.515 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.515 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.515 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.528 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7faaed35dee0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.531 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7faaed35dee0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.531 226326 INFO nova.virt.libvirt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Connection event '1' reason 'None'
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.538 226326 INFO nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Libvirt host capabilities <capabilities>
Jan 26 10:01:22 compute-1 nova_compute[226322]: 
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <host>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <uuid>0657a708-098a-4137-a4d8-8ea25323424c</uuid>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <cpu>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <arch>x86_64</arch>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model>EPYC-Rome-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <vendor>AMD</vendor>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <microcode version='16777317'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <signature family='23' model='49' stepping='0'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='x2apic'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='tsc-deadline'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='osxsave'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='hypervisor'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='tsc_adjust'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='spec-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='stibp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='arch-capabilities'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='ssbd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='cmp_legacy'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='topoext'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='virt-ssbd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='lbrv'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='tsc-scale'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='vmcb-clean'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='pause-filter'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='pfthreshold'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='svme-addr-chk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='rdctl-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='skip-l1dfl-vmentry'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='mds-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature name='pschange-mc-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <pages unit='KiB' size='4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <pages unit='KiB' size='2048'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <pages unit='KiB' size='1048576'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </cpu>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <power_management>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <suspend_mem/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </power_management>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <iommu support='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <migration_features>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <live/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <uri_transports>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <uri_transport>tcp</uri_transport>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <uri_transport>rdma</uri_transport>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </uri_transports>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </migration_features>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <topology>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <cells num='1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <cell id='0'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:           <memory unit='KiB'>7864304</memory>
Jan 26 10:01:22 compute-1 nova_compute[226322]:           <pages unit='KiB' size='4'>1966076</pages>
Jan 26 10:01:22 compute-1 nova_compute[226322]:           <pages unit='KiB' size='2048'>0</pages>
Jan 26 10:01:22 compute-1 nova_compute[226322]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 26 10:01:22 compute-1 nova_compute[226322]:           <distances>
Jan 26 10:01:22 compute-1 nova_compute[226322]:             <sibling id='0' value='10'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:           </distances>
Jan 26 10:01:22 compute-1 nova_compute[226322]:           <cpus num='8'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:           </cpus>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         </cell>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </cells>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </topology>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <cache>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </cache>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <secmodel>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model>selinux</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <doi>0</doi>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </secmodel>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <secmodel>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model>dac</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <doi>0</doi>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </secmodel>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </host>
Jan 26 10:01:22 compute-1 nova_compute[226322]: 
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <guest>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <os_type>hvm</os_type>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <arch name='i686'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <wordsize>32</wordsize>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <domain type='qemu'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <domain type='kvm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </arch>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <features>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <pae/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <nonpae/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <acpi default='on' toggle='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <apic default='on' toggle='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <cpuselection/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <deviceboot/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <disksnapshot default='on' toggle='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <externalSnapshot/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </features>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </guest>
Jan 26 10:01:22 compute-1 nova_compute[226322]: 
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <guest>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <os_type>hvm</os_type>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <arch name='x86_64'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <wordsize>64</wordsize>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <domain type='qemu'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <domain type='kvm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </arch>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <features>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <acpi default='on' toggle='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <apic default='on' toggle='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <cpuselection/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <deviceboot/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <disksnapshot default='on' toggle='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <externalSnapshot/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </features>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </guest>
Jan 26 10:01:22 compute-1 nova_compute[226322]: 
Jan 26 10:01:22 compute-1 nova_compute[226322]: </capabilities>
Jan 26 10:01:22 compute-1 nova_compute[226322]: 
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.544 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.546 226326 DEBUG nova.virt.libvirt.volume.mount [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.549 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 26 10:01:22 compute-1 nova_compute[226322]: <domainCapabilities>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <domain>kvm</domain>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <arch>i686</arch>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <vcpu max='4096'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <iothreads supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <os supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <enum name='firmware'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <loader supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='type'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>rom</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>pflash</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='readonly'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>yes</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>no</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='secure'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>no</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </loader>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </os>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <cpu>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <mode name='host-passthrough' supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='hostPassthroughMigratable'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>on</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>off</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </mode>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <mode name='maximum' supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='maximumMigratable'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>on</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>off</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </mode>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <mode name='host-model' supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <vendor>AMD</vendor>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='x2apic'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 10:01:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93dfa11a05619004f732a88ff930df4934d43a692e19d27b8c03adb369fa54e2/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='hypervisor'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='stibp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='ssbd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='overflow-recov'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='succor'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='lbrv'/>
Jan 26 10:01:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93dfa11a05619004f732a88ff930df4934d43a692e19d27b8c03adb369fa54e2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 10:01:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93dfa11a05619004f732a88ff930df4934d43a692e19d27b8c03adb369fa54e2/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 10:01:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93dfa11a05619004f732a88ff930df4934d43a692e19d27b8c03adb369fa54e2/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='tsc-scale'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='flushbyasid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='pause-filter'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='pfthreshold'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='disable' name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </mode>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <mode name='custom' supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-noTSX'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='ClearwaterForest'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ddpd-u'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='intel-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='lam'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sha512'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sm3'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sm4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='ClearwaterForest-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ddpd-u'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='intel-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='lam'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sha512'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sm3'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sm4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cooperlake'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cooperlake-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cooperlake-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Denverton'>
Jan 26 10:01:22 compute-1 bash[226436]: 3a6ae4dc2df3af4a6bf4e06b720b3b075b63d61c4adfe2b2c9d563de7c3cae7c
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mpx'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Denverton-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mpx'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Denverton-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Denverton-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Dhyana-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Genoa'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='auto-ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='auto-ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='auto-ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='perfmon-v2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Milan'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Milan-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Milan-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Milan-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Rome'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Rome-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Rome-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Rome-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Turin'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='auto-ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vp2intersect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibpb-brtype'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='perfmon-v2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbpb'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='srso-user-kernel-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Turin-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='auto-ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vp2intersect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibpb-brtype'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='perfmon-v2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbpb'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='srso-user-kernel-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-v5'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='GraniteRapids'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='GraniteRapids-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 podman[226436]: 2026-01-26 10:01:22.471028135 +0000 UTC m=+0.038734717 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='GraniteRapids-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-128'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-256'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-512'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 podman[226436]: 2026-01-26 10:01:22.571898772 +0000 UTC m=+0.139605344 container init 3a6ae4dc2df3af4a6bf4e06b720b3b075b63d61c4adfe2b2c9d563de7c3cae7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='GraniteRapids-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-128'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-256'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-512'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 podman[226436]: 2026-01-26 10:01:22.577202603 +0000 UTC m=+0.144909155 container start 3a6ae4dc2df3af4a6bf4e06b720b3b075b63d61c4adfe2b2c9d563de7c3cae7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-noTSX'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:22 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:22 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v5'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v6'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v7'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='IvyBridge'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='IvyBridge-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='IvyBridge-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='IvyBridge-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='KnightsMill'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-4fmaps'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-4vnniw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512er'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512pf'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='KnightsMill-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-4fmaps'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-4vnniw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512er'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512pf'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Opteron_G4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fma4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xop'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Opteron_G4-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fma4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xop'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Opteron_G5'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fma4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tbm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xop'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Opteron_G5-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fma4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tbm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xop'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SapphireRapids'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SapphireRapids-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SapphireRapids-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SapphireRapids-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SapphireRapids-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SierraForest'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SierraForest-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SierraForest-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='intel-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='lam'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SierraForest-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='intel-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='lam'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-v5'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Snowridge'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='core-capability'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mpx'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='split-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Snowridge-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='core-capability'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mpx'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='split-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Snowridge-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='core-capability'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='split-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Snowridge-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='core-capability'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='split-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Snowridge-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='athlon'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnow'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnowext'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='athlon-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnow'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnowext'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='core2duo'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='core2duo-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='coreduo'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='coreduo-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='n270'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='n270-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='phenom'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnow'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnowext'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='phenom-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnow'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnowext'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </mode>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </cpu>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <memoryBacking supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <enum name='sourceType'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <value>file</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <value>anonymous</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <value>memfd</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </memoryBacking>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <devices>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <disk supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='diskDevice'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>disk</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>cdrom</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>floppy</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>lun</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='bus'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>fdc</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>scsi</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>usb</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>sata</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='model'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio-transitional</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio-non-transitional</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </disk>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <graphics supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='type'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vnc</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>egl-headless</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>dbus</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </graphics>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <video supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='modelType'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vga</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>cirrus</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>none</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>bochs</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>ramfb</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </video>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <hostdev supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='mode'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>subsystem</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='startupPolicy'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>default</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>mandatory</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>requisite</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>optional</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='subsysType'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>usb</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>pci</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>scsi</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='capsType'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='pciBackend'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </hostdev>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <rng supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='model'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio-transitional</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio-non-transitional</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='backendModel'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>random</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>egd</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>builtin</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </rng>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <filesystem supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='driverType'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>path</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>handle</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtiofs</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </filesystem>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <tpm supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='model'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>tpm-tis</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>tpm-crb</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='backendModel'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>emulator</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>external</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='backendVersion'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>2.0</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </tpm>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <redirdev supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='bus'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>usb</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </redirdev>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <channel supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='type'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>pty</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>unix</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </channel>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <crypto supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='model'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='type'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>qemu</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='backendModel'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>builtin</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </crypto>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <interface supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='backendType'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>default</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>passt</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </interface>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <panic supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='model'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>isa</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>hyperv</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </panic>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <console supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='type'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>null</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vc</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>pty</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>dev</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>file</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>pipe</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>stdio</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>udp</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>tcp</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>unix</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>qemu-vdagent</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>dbus</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </console>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </devices>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <features>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <gic supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <vmcoreinfo supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <genid supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <backingStoreInput supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <backup supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <async-teardown supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <s390-pv supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <ps2 supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <tdx supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <sev supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <sgx supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <hyperv supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='features'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>relaxed</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vapic</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>spinlocks</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vpindex</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>runtime</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>synic</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>stimer</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>reset</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vendor_id</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>frequencies</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>reenlightenment</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>tlbflush</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>ipi</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>avic</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>emsr_bitmap</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>xmm_input</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <defaults>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <spinlocks>4095</spinlocks>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <stimer_direct>on</stimer_direct>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </defaults>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </hyperv>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <launchSecurity supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </features>
Jan 26 10:01:22 compute-1 nova_compute[226322]: </domainCapabilities>
Jan 26 10:01:22 compute-1 nova_compute[226322]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.561 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 26 10:01:22 compute-1 nova_compute[226322]: <domainCapabilities>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <domain>kvm</domain>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <arch>i686</arch>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <vcpu max='240'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <iothreads supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <os supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <enum name='firmware'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <loader supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='type'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>rom</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>pflash</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='readonly'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>yes</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>no</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='secure'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>no</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </loader>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </os>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <cpu>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <mode name='host-passthrough' supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='hostPassthroughMigratable'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>on</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>off</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </mode>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <mode name='maximum' supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='maximumMigratable'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>on</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>off</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </mode>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <mode name='host-model' supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <vendor>AMD</vendor>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='x2apic'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='hypervisor'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='stibp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='ssbd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='overflow-recov'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='succor'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='lbrv'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='tsc-scale'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='flushbyasid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='pause-filter'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='pfthreshold'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='disable' name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </mode>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <mode name='custom' supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-noTSX'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='ClearwaterForest'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ddpd-u'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='intel-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='lam'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sha512'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sm3'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sm4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='ClearwaterForest-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ddpd-u'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='intel-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='lam'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sha512'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sm3'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sm4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cooperlake'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cooperlake-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cooperlake-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Denverton'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mpx'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Denverton-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mpx'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Denverton-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Denverton-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Dhyana-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Genoa'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='auto-ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='auto-ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='auto-ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='perfmon-v2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Milan'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Milan-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Milan-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Milan-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Rome'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Rome-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Rome-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Rome-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Turin'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='auto-ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vp2intersect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibpb-brtype'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='perfmon-v2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbpb'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='srso-user-kernel-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Turin-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='auto-ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vp2intersect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibpb-brtype'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='perfmon-v2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbpb'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='srso-user-kernel-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-v5'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='GraniteRapids'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='GraniteRapids-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='GraniteRapids-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-128'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-256'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-512'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='GraniteRapids-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-128'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-256'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-512'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-noTSX'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v5'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v6'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v7'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='IvyBridge'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='IvyBridge-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='IvyBridge-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='IvyBridge-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='KnightsMill'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-4fmaps'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-4vnniw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512er'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512pf'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='KnightsMill-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-4fmaps'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-4vnniw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512er'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512pf'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Opteron_G4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fma4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xop'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Opteron_G4-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fma4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xop'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Opteron_G5'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fma4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tbm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xop'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Opteron_G5-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fma4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tbm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xop'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SapphireRapids'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SapphireRapids-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SapphireRapids-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SapphireRapids-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SapphireRapids-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SierraForest'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SierraForest-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SierraForest-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='intel-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='lam'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SierraForest-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='intel-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='lam'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-v5'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Snowridge'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='core-capability'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mpx'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='split-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Snowridge-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='core-capability'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mpx'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='split-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Snowridge-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='core-capability'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='split-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Snowridge-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='core-capability'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='split-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Snowridge-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='athlon'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnow'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnowext'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='athlon-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnow'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnowext'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='core2duo'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='core2duo-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='coreduo'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='coreduo-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='n270'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='n270-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='phenom'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnow'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnowext'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='phenom-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnow'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnowext'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </mode>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </cpu>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <memoryBacking supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <enum name='sourceType'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <value>file</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <value>anonymous</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <value>memfd</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </memoryBacking>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <devices>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <disk supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='diskDevice'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>disk</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>cdrom</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>floppy</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>lun</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='bus'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>ide</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>fdc</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>scsi</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>usb</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>sata</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='model'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio-transitional</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio-non-transitional</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </disk>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <graphics supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='type'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vnc</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>egl-headless</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>dbus</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </graphics>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <video supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='modelType'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vga</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>cirrus</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>none</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>bochs</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>ramfb</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </video>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <hostdev supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='mode'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>subsystem</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='startupPolicy'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>default</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>mandatory</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>requisite</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>optional</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='subsysType'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>usb</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>pci</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>scsi</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='capsType'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='pciBackend'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </hostdev>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <rng supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='model'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio-transitional</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio-non-transitional</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='backendModel'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>random</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>egd</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>builtin</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </rng>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <filesystem supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='driverType'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>path</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>handle</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtiofs</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </filesystem>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <tpm supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='model'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>tpm-tis</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>tpm-crb</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='backendModel'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>emulator</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>external</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='backendVersion'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>2.0</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </tpm>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <redirdev supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='bus'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>usb</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </redirdev>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <channel supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='type'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>pty</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>unix</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </channel>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <crypto supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='model'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='type'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>qemu</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='backendModel'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>builtin</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </crypto>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <interface supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='backendType'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>default</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>passt</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </interface>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <panic supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='model'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>isa</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>hyperv</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </panic>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <console supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='type'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>null</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vc</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>pty</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>dev</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>file</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>pipe</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>stdio</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>udp</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>tcp</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>unix</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>qemu-vdagent</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>dbus</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </console>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </devices>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <features>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <gic supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <vmcoreinfo supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <genid supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <backingStoreInput supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <backup supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <async-teardown supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <s390-pv supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <ps2 supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <tdx supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <sev supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <sgx supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <hyperv supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='features'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>relaxed</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vapic</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>spinlocks</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vpindex</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>runtime</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>synic</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>stimer</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>reset</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vendor_id</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>frequencies</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>reenlightenment</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>tlbflush</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>ipi</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>avic</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>emsr_bitmap</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>xmm_input</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <defaults>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <spinlocks>4095</spinlocks>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <stimer_direct>on</stimer_direct>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </defaults>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </hyperv>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <launchSecurity supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </features>
Jan 26 10:01:22 compute-1 nova_compute[226322]: </domainCapabilities>
Jan 26 10:01:22 compute-1 nova_compute[226322]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.644 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.647 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 26 10:01:22 compute-1 nova_compute[226322]: <domainCapabilities>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <domain>kvm</domain>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <arch>x86_64</arch>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <vcpu max='4096'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <iothreads supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <os supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <enum name='firmware'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <value>efi</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <loader supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='type'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>rom</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>pflash</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='readonly'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>yes</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>no</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='secure'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>yes</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>no</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </loader>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </os>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <cpu>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <mode name='host-passthrough' supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='hostPassthroughMigratable'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>on</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>off</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </mode>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <mode name='maximum' supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='maximumMigratable'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>on</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>off</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </mode>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <mode name='host-model' supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <vendor>AMD</vendor>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='x2apic'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='hypervisor'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='stibp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='ssbd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='overflow-recov'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='succor'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='lbrv'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='tsc-scale'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='flushbyasid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='pause-filter'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='pfthreshold'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='disable' name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </mode>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <mode name='custom' supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-noTSX'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='ClearwaterForest'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ddpd-u'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='intel-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='lam'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sha512'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sm3'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sm4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='ClearwaterForest-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ddpd-u'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='intel-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='lam'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sha512'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sm3'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sm4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cooperlake'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cooperlake-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cooperlake-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Denverton'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mpx'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Denverton-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mpx'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Denverton-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Denverton-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Dhyana-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Genoa'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='auto-ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='auto-ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='auto-ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='perfmon-v2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Milan'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Milan-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Milan-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Milan-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Rome'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Rome-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Rome-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Rome-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Turin'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='auto-ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vp2intersect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibpb-brtype'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='perfmon-v2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbpb'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='srso-user-kernel-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Turin-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='auto-ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vp2intersect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibpb-brtype'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='perfmon-v2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbpb'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='srso-user-kernel-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-v5'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='GraniteRapids'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='GraniteRapids-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='GraniteRapids-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-128'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-256'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-512'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='GraniteRapids-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-128'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-256'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-512'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-noTSX'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v5'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v6'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v7'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='IvyBridge'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='IvyBridge-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='IvyBridge-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='IvyBridge-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='KnightsMill'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-4fmaps'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-4vnniw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512er'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512pf'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='KnightsMill-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-4fmaps'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-4vnniw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512er'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512pf'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Opteron_G4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fma4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xop'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Opteron_G4-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fma4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xop'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Opteron_G5'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fma4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tbm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xop'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Opteron_G5-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fma4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tbm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xop'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SapphireRapids'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SapphireRapids-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SapphireRapids-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SapphireRapids-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SapphireRapids-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SierraForest'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SierraForest-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SierraForest-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='intel-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='lam'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SierraForest-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='intel-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='lam'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-v5'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Snowridge'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='core-capability'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mpx'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='split-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Snowridge-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='core-capability'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mpx'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='split-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Snowridge-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='core-capability'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='split-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Snowridge-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='core-capability'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='split-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Snowridge-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='athlon'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnow'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnowext'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='athlon-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnow'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnowext'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='core2duo'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='core2duo-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='coreduo'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='coreduo-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='n270'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='n270-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='phenom'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnow'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnowext'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='phenom-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnow'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnowext'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </mode>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </cpu>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <memoryBacking supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <enum name='sourceType'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <value>file</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <value>anonymous</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <value>memfd</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </memoryBacking>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <devices>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <disk supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='diskDevice'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>disk</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>cdrom</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>floppy</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>lun</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='bus'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>fdc</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>scsi</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>usb</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>sata</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='model'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio-transitional</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio-non-transitional</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </disk>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <graphics supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='type'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vnc</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>egl-headless</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>dbus</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </graphics>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <video supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='modelType'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vga</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>cirrus</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>none</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>bochs</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>ramfb</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </video>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <hostdev supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='mode'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>subsystem</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='startupPolicy'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>default</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>mandatory</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>requisite</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>optional</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='subsysType'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>usb</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>pci</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>scsi</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='capsType'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='pciBackend'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </hostdev>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <rng supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='model'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio-transitional</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio-non-transitional</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='backendModel'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>random</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>egd</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>builtin</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </rng>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <filesystem supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='driverType'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>path</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>handle</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtiofs</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </filesystem>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <tpm supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='model'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>tpm-tis</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>tpm-crb</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='backendModel'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>emulator</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>external</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='backendVersion'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>2.0</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </tpm>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <redirdev supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='bus'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>usb</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </redirdev>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <channel supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='type'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>pty</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>unix</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </channel>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <crypto supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='model'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='type'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>qemu</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='backendModel'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>builtin</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </crypto>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <interface supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='backendType'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>default</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>passt</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </interface>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <panic supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='model'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>isa</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>hyperv</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </panic>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <console supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='type'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>null</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vc</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>pty</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>dev</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>file</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>pipe</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>stdio</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>udp</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>tcp</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>unix</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>qemu-vdagent</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>dbus</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </console>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </devices>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <features>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <gic supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <vmcoreinfo supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <genid supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <backingStoreInput supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <backup supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <async-teardown supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <s390-pv supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <ps2 supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <tdx supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <sev supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <sgx supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <hyperv supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='features'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>relaxed</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vapic</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>spinlocks</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vpindex</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>runtime</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>synic</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>stimer</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>reset</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vendor_id</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>frequencies</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>reenlightenment</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>tlbflush</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>ipi</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>avic</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>emsr_bitmap</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>xmm_input</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <defaults>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <spinlocks>4095</spinlocks>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <stimer_direct>on</stimer_direct>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </defaults>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </hyperv>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <launchSecurity supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </features>
Jan 26 10:01:22 compute-1 nova_compute[226322]: </domainCapabilities>
Jan 26 10:01:22 compute-1 nova_compute[226322]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.728 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 26 10:01:22 compute-1 nova_compute[226322]: <domainCapabilities>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <domain>kvm</domain>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <arch>x86_64</arch>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <vcpu max='240'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <iothreads supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <os supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <enum name='firmware'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <loader supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='type'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>rom</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>pflash</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='readonly'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>yes</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>no</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='secure'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>no</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </loader>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </os>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <cpu>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <mode name='host-passthrough' supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='hostPassthroughMigratable'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>on</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>off</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </mode>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <mode name='maximum' supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='maximumMigratable'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>on</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>off</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </mode>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <mode name='host-model' supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <vendor>AMD</vendor>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='x2apic'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='hypervisor'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='stibp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='ssbd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='overflow-recov'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='succor'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='lbrv'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='tsc-scale'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='flushbyasid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='pause-filter'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='pfthreshold'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <feature policy='disable' name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </mode>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <mode name='custom' supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-noTSX'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Broadwell-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='ClearwaterForest'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ddpd-u'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='intel-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='lam'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sha512'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sm3'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sm4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='ClearwaterForest-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ddpd-u'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='intel-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='lam'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sha512'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sm3'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sm4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cooperlake'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cooperlake-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Cooperlake-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Denverton'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mpx'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Denverton-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mpx'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Denverton-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Denverton-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Dhyana-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Genoa'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='auto-ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='auto-ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='auto-ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='perfmon-v2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Milan'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Milan-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Milan-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Milan-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Rome'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Rome-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Rome-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Rome-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Turin'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='auto-ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vp2intersect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibpb-brtype'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='perfmon-v2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbpb'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='srso-user-kernel-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-Turin-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amd-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='auto-ibrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vp2intersect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fs-gs-base-ns'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibpb-brtype'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='no-nested-data-bp'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='null-sel-clr-base'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='perfmon-v2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbpb'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='srso-user-kernel-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='stibp-always-on'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='EPYC-v5'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='GraniteRapids'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='GraniteRapids-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='GraniteRapids-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-128'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-256'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-512'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='GraniteRapids-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-128'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-256'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx10-512'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='prefetchiti'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-noTSX'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Haswell-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v5'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v6'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Icelake-Server-v7'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='IvyBridge'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='IvyBridge-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='IvyBridge-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='IvyBridge-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='KnightsMill'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-4fmaps'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-4vnniw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512er'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512pf'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='KnightsMill-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-4fmaps'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-4vnniw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512er'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512pf'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Opteron_G4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fma4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xop'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Opteron_G4-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fma4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xop'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Opteron_G5'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fma4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tbm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xop'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Opteron_G5-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fma4'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tbm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xop'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SapphireRapids'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SapphireRapids-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SapphireRapids-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SapphireRapids-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SapphireRapids-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='amx-tile'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-bf16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-fp16'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512-vpopcntdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bitalg'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vbmi2'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrc'/>
Jan 26 10:01:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:22 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:22 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fzrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='la57'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='taa-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='tsx-ldtrk'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:22 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SierraForest'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SierraForest-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 ceph-mon[80107]: pgmap v566: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:22 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SierraForest-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='intel-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='lam'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='SierraForest-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ifma'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-ne-convert'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx-vnni-int8'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bhi-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='bus-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cmpccxadd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fbsdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='fsrs'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ibrs-all'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='intel-psfd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ipred-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='lam'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mcdt-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pbrsb-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='psdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rrsba-ctrl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='sbdr-ssdp-no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='serialize'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vaes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='vpclmulqdq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Client-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='hle'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='rtm'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:22 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Skylake-Server-v5'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512bw'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512cd'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512dq'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512f'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='avx512vl'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='invpcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pcid'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='pku'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Snowridge'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='core-capability'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mpx'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='split-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Snowridge-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='core-capability'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='mpx'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='split-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Snowridge-v2'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='core-capability'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='split-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Snowridge-v3'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='core-capability'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='split-lock-detect'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='Snowridge-v4'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='cldemote'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='erms'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='gfni'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdir64b'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='movdiri'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='xsaves'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='athlon'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnow'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnowext'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='athlon-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnow'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnowext'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='core2duo'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='core2duo-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='coreduo'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='coreduo-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='n270'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='n270-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='ss'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='phenom'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnow'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnowext'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <blockers model='phenom-v1'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnow'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <feature name='3dnowext'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </blockers>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </mode>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </cpu>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <memoryBacking supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <enum name='sourceType'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <value>file</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <value>anonymous</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <value>memfd</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </memoryBacking>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <devices>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <disk supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='diskDevice'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>disk</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>cdrom</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>floppy</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>lun</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='bus'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>ide</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>fdc</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>scsi</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>usb</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>sata</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='model'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio-transitional</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio-non-transitional</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </disk>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <graphics supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='type'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vnc</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>egl-headless</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>dbus</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </graphics>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <video supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='modelType'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vga</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>cirrus</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>none</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>bochs</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>ramfb</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </video>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <hostdev supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='mode'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>subsystem</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='startupPolicy'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>default</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>mandatory</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>requisite</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>optional</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='subsysType'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>usb</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>pci</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>scsi</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='capsType'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='pciBackend'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </hostdev>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <rng supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='model'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio-transitional</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtio-non-transitional</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='backendModel'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>random</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>egd</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>builtin</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </rng>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <filesystem supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='driverType'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>path</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>handle</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>virtiofs</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </filesystem>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <tpm supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='model'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>tpm-tis</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>tpm-crb</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='backendModel'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>emulator</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>external</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='backendVersion'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>2.0</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </tpm>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <redirdev supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='bus'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>usb</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </redirdev>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <channel supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='type'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>pty</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>unix</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </channel>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <crypto supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='model'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='type'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>qemu</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='backendModel'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>builtin</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </crypto>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <interface supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='backendType'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>default</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>passt</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </interface>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <panic supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='model'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>isa</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>hyperv</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </panic>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <console supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='type'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>null</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vc</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>pty</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>dev</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>file</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>pipe</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>stdio</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>udp</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>tcp</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>unix</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>qemu-vdagent</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>dbus</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </console>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </devices>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   <features>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <gic supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <vmcoreinfo supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <genid supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <backingStoreInput supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <backup supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <async-teardown supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <s390-pv supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <ps2 supported='yes'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <tdx supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <sev supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <sgx supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <hyperv supported='yes'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <enum name='features'>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>relaxed</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vapic</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>spinlocks</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vpindex</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>runtime</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>synic</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>stimer</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>reset</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>vendor_id</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>frequencies</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>reenlightenment</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>tlbflush</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>ipi</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>avic</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>emsr_bitmap</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <value>xmm_input</value>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </enum>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       <defaults>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <spinlocks>4095</spinlocks>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <stimer_direct>on</stimer_direct>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 10:01:22 compute-1 nova_compute[226322]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 10:01:22 compute-1 nova_compute[226322]:       </defaults>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     </hyperv>
Jan 26 10:01:22 compute-1 nova_compute[226322]:     <launchSecurity supported='no'/>
Jan 26 10:01:22 compute-1 nova_compute[226322]:   </features>
Jan 26 10:01:22 compute-1 nova_compute[226322]: </domainCapabilities>
Jan 26 10:01:22 compute-1 nova_compute[226322]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.803 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.804 226326 INFO nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Secure Boot support detected
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.806 226326 INFO nova.virt.libvirt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.806 226326 INFO nova.virt.libvirt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.819 226326 DEBUG nova.virt.libvirt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.840 226326 INFO nova.virt.node [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Determined node identity d06842a0-5d13-4573-bb78-d433bbb380e4 from /var/lib/nova/compute_id
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.863 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Verified node d06842a0-5d13-4573-bb78-d433bbb380e4 matches my host compute-1.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Jan 26 10:01:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:22 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.904 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 26 10:01:22 compute-1 nova_compute[226322]: 2026-01-26 10:01:22.993 226326 ERROR nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Could not retrieve compute node resource provider d06842a0-5d13-4573-bb78-d433bbb380e4 and therefore unable to error out any instances stuck in BUILDING state. Error: Failed to retrieve allocations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'd06842a0-5d13-4573-bb78-d433bbb380e4' not found: No resource provider with uuid d06842a0-5d13-4573-bb78-d433bbb380e4 found  ", "request_id": "req-80151d17-4a2f-4040-8a9d-3cf1f4c18fd5"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'd06842a0-5d13-4573-bb78-d433bbb380e4' not found: No resource provider with uuid d06842a0-5d13-4573-bb78-d433bbb380e4 found  ", "request_id": "req-80151d17-4a2f-4040-8a9d-3cf1f4c18fd5"}]}
Jan 26 10:01:22 compute-1 rsyslogd[1005]: imjournal from <np0005595445:nova_compute>: begin to drop messages due to rate-limiting
Jan 26 10:01:23 compute-1 nova_compute[226322]: 2026-01-26 10:01:23.013 226326 DEBUG oslo_concurrency.lockutils [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:01:23 compute-1 nova_compute[226322]: 2026-01-26 10:01:23.013 226326 DEBUG oslo_concurrency.lockutils [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:01:23 compute-1 nova_compute[226322]: 2026-01-26 10:01:23.014 226326 DEBUG oslo_concurrency.lockutils [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:01:23 compute-1 nova_compute[226322]: 2026-01-26 10:01:23.014 226326 DEBUG nova.compute.resource_tracker [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:01:23 compute-1 nova_compute[226322]: 2026-01-26 10:01:23.014 226326 DEBUG oslo_concurrency.processutils [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:01:23 compute-1 sudo[226659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emcfkqmcbwardcmqvflkknrycghkxpwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769421683.1404326-3618-8635250991761/AnsiballZ_podman_container.py'
Jan 26 10:01:23 compute-1 sudo[226659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:01:23 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:01:23 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/504650738' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:01:23 compute-1 nova_compute[226322]: 2026-01-26 10:01:23.481 226326 DEBUG oslo_concurrency.processutils [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:01:23 compute-1 nova_compute[226322]: 2026-01-26 10:01:23.646 226326 WARNING nova.virt.libvirt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:01:23 compute-1 nova_compute[226322]: 2026-01-26 10:01:23.647 226326 DEBUG nova.compute.resource_tracker [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5264MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:01:23 compute-1 nova_compute[226322]: 2026-01-26 10:01:23.647 226326 DEBUG oslo_concurrency.lockutils [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:01:23 compute-1 nova_compute[226322]: 2026-01-26 10:01:23.648 226326 DEBUG oslo_concurrency.lockutils [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:01:23 compute-1 python3.9[226661]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 26 10:01:23 compute-1 nova_compute[226322]: 2026-01-26 10:01:23.762 226326 ERROR nova.compute.resource_tracker [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'd06842a0-5d13-4573-bb78-d433bbb380e4' not found: No resource provider with uuid d06842a0-5d13-4573-bb78-d433bbb380e4 found  ", "request_id": "req-44f8389d-e9d3-4c32-814e-9d6c09ca32d3"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'd06842a0-5d13-4573-bb78-d433bbb380e4' not found: No resource provider with uuid d06842a0-5d13-4573-bb78-d433bbb380e4 found  ", "request_id": "req-44f8389d-e9d3-4c32-814e-9d6c09ca32d3"}]}
Jan 26 10:01:23 compute-1 nova_compute[226322]: 2026-01-26 10:01:23.763 226326 DEBUG nova.compute.resource_tracker [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:01:23 compute-1 nova_compute[226322]: 2026-01-26 10:01:23.763 226326 DEBUG nova.compute.resource_tracker [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:01:23 compute-1 nova_compute[226322]: 2026-01-26 10:01:23.905 226326 INFO nova.scheduler.client.report [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [req-1c7cf3a9-f24c-4b76-9a02-1bf5ff516df4] Created resource provider record via placement API for resource provider with UUID d06842a0-5d13-4573-bb78-d433bbb380e4 and name compute-1.ctlplane.example.com.
Jan 26 10:01:23 compute-1 nova_compute[226322]: 2026-01-26 10:01:23.919 226326 DEBUG oslo_concurrency.processutils [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:01:23 compute-1 systemd[1]: Started libpod-conmon-f8473b9a2106532a3892fa37e60a6e60a0eb71b6d886197e161aaec5fb0ab5d9.scope.
Jan 26 10:01:23 compute-1 systemd[1]: Started libcrun container.
Jan 26 10:01:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd4203076b2ca5f2c7fe52b6eb8695587df649831b018a961d32162a2cbcde6c/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 26 10:01:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd4203076b2ca5f2c7fe52b6eb8695587df649831b018a961d32162a2cbcde6c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 26 10:01:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd4203076b2ca5f2c7fe52b6eb8695587df649831b018a961d32162a2cbcde6c/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 26 10:01:24 compute-1 podman[226689]: 2026-01-26 10:01:24.01102454 +0000 UTC m=+0.154430906 container init f8473b9a2106532a3892fa37e60a6e60a0eb71b6d886197e161aaec5fb0ab5d9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 10:01:24 compute-1 podman[226689]: 2026-01-26 10:01:24.020500839 +0000 UTC m=+0.163907165 container start f8473b9a2106532a3892fa37e60a6e60a0eb71b6d886197e161aaec5fb0ab5d9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute_init)
Jan 26 10:01:24 compute-1 python3.9[226661]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 26 10:01:24 compute-1 nova_compute_init[226721]: INFO:nova_statedir:Applying nova statedir ownership
Jan 26 10:01:24 compute-1 nova_compute_init[226721]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 26 10:01:24 compute-1 nova_compute_init[226721]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 26 10:01:24 compute-1 nova_compute_init[226721]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 26 10:01:24 compute-1 nova_compute_init[226721]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 26 10:01:24 compute-1 nova_compute_init[226721]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 26 10:01:24 compute-1 nova_compute_init[226721]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 26 10:01:24 compute-1 nova_compute_init[226721]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 26 10:01:24 compute-1 nova_compute_init[226721]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 26 10:01:24 compute-1 nova_compute_init[226721]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 26 10:01:24 compute-1 nova_compute_init[226721]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 26 10:01:24 compute-1 nova_compute_init[226721]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 26 10:01:24 compute-1 nova_compute_init[226721]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 26 10:01:24 compute-1 nova_compute_init[226721]: INFO:nova_statedir:Nova statedir ownership complete
Jan 26 10:01:24 compute-1 systemd[1]: libpod-f8473b9a2106532a3892fa37e60a6e60a0eb71b6d886197e161aaec5fb0ab5d9.scope: Deactivated successfully.
Jan 26 10:01:24 compute-1 conmon[226705]: conmon f8473b9a2106532a3892 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f8473b9a2106532a3892fa37e60a6e60a0eb71b6d886197e161aaec5fb0ab5d9.scope/container/memory.events
Jan 26 10:01:24 compute-1 podman[226742]: 2026-01-26 10:01:24.127609372 +0000 UTC m=+0.025753275 container died f8473b9a2106532a3892fa37e60a6e60a0eb71b6d886197e161aaec5fb0ab5d9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 26 10:01:24 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f8473b9a2106532a3892fa37e60a6e60a0eb71b6d886197e161aaec5fb0ab5d9-userdata-shm.mount: Deactivated successfully.
Jan 26 10:01:24 compute-1 systemd[1]: var-lib-containers-storage-overlay-bd4203076b2ca5f2c7fe52b6eb8695587df649831b018a961d32162a2cbcde6c-merged.mount: Deactivated successfully.
Jan 26 10:01:24 compute-1 podman[226742]: 2026-01-26 10:01:24.177352065 +0000 UTC m=+0.075495958 container cleanup f8473b9a2106532a3892fa37e60a6e60a0eb71b6d886197e161aaec5fb0ab5d9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Jan 26 10:01:24 compute-1 systemd[1]: libpod-conmon-f8473b9a2106532a3892fa37e60a6e60a0eb71b6d886197e161aaec5fb0ab5d9.scope: Deactivated successfully.
Jan 26 10:01:24 compute-1 sudo[226659]: pam_unix(sudo:session): session closed for user root
Jan 26 10:01:24 compute-1 ceph-mon[80107]: pgmap v567: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 10:01:24 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/504650738' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:01:24 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/472112996' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:01:24 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:01:24 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/392915151' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:01:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:01:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:24.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000037s ======
Jan 26 10:01:24 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:24.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000037s
Jan 26 10:01:24 compute-1 nova_compute[226322]: 2026-01-26 10:01:24.375 226326 DEBUG oslo_concurrency.processutils [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:01:24 compute-1 nova_compute[226322]: 2026-01-26 10:01:24.381 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 26 10:01:24 compute-1 nova_compute[226322]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Jan 26 10:01:24 compute-1 nova_compute[226322]: 2026-01-26 10:01:24.381 226326 INFO nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] kernel doesn't support AMD SEV
Jan 26 10:01:24 compute-1 nova_compute[226322]: 2026-01-26 10:01:24.382 226326 DEBUG nova.compute.provider_tree [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Updating inventory in ProviderTree for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 10:01:24 compute-1 nova_compute[226322]: 2026-01-26 10:01:24.383 226326 DEBUG nova.virt.libvirt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 10:01:24 compute-1 nova_compute[226322]: 2026-01-26 10:01:24.424 226326 DEBUG nova.scheduler.client.report [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Updated inventory for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 26 10:01:24 compute-1 nova_compute[226322]: 2026-01-26 10:01:24.425 226326 DEBUG nova.compute.provider_tree [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Updating resource provider d06842a0-5d13-4573-bb78-d433bbb380e4 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 26 10:01:24 compute-1 nova_compute[226322]: 2026-01-26 10:01:24.425 226326 DEBUG nova.compute.provider_tree [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Updating inventory in ProviderTree for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 10:01:24 compute-1 nova_compute[226322]: 2026-01-26 10:01:24.520 226326 DEBUG nova.compute.provider_tree [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Updating resource provider d06842a0-5d13-4573-bb78-d433bbb380e4 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 26 10:01:24 compute-1 nova_compute[226322]: 2026-01-26 10:01:24.543 226326 DEBUG nova.compute.resource_tracker [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:01:24 compute-1 nova_compute[226322]: 2026-01-26 10:01:24.543 226326 DEBUG oslo_concurrency.lockutils [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.896s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:01:24 compute-1 nova_compute[226322]: 2026-01-26 10:01:24.544 226326 DEBUG nova.service [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Jan 26 10:01:24 compute-1 nova_compute[226322]: 2026-01-26 10:01:24.642 226326 DEBUG nova.service [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Jan 26 10:01:24 compute-1 nova_compute[226322]: 2026-01-26 10:01:24.642 226326 DEBUG nova.servicegroup.drivers.db [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Jan 26 10:01:25 compute-1 sshd-session[202121]: Connection closed by 192.168.122.30 port 57050
Jan 26 10:01:25 compute-1 sshd-session[202106]: pam_unix(sshd:session): session closed for user zuul
Jan 26 10:01:25 compute-1 systemd[1]: session-53.scope: Deactivated successfully.
Jan 26 10:01:25 compute-1 systemd[1]: session-53.scope: Consumed 2min 5.826s CPU time.
Jan 26 10:01:25 compute-1 systemd-logind[783]: Session 53 logged out. Waiting for processes to exit.
Jan 26 10:01:25 compute-1 systemd-logind[783]: Removed session 53.
Jan 26 10:01:25 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/392915151' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:01:25 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/729255989' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:01:26 compute-1 ceph-mon[80107]: pgmap v568: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Jan 26 10:01:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:26.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:01:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000038s ======
Jan 26 10:01:26 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:26.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Jan 26 10:01:27 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/512875373' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:01:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:01:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:01:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:28 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:28.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000038s ======
Jan 26 10:01:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:28.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Jan 26 10:01:28 compute-1 ceph-mon[80107]: pgmap v569: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 26 10:01:28 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1208342742' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:01:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:28 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:01:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:28 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:01:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000038s ======
Jan 26 10:01:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:30.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Jan 26 10:01:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:30.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:30 compute-1 ceph-mon[80107]: pgmap v570: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 26 10:01:32 compute-1 ceph-mon[80107]: pgmap v571: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 853 B/s wr, 2 op/s
Jan 26 10:01:32 compute-1 podman[226797]: 2026-01-26 10:01:32.323128213 +0000 UTC m=+0.086903600 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 26 10:01:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:32.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000038s ======
Jan 26 10:01:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:32.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Jan 26 10:01:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100132 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 10:01:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:01:34 compute-1 sshd-session[226816]: Connection closed by authenticating user root 152.42.136.110 port 37888 [preauth]
Jan 26 10:01:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000038s ======
Jan 26 10:01:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:34.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Jan 26 10:01:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:34.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:34 compute-1 ceph-mon[80107]: pgmap v572: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 853 B/s wr, 2 op/s
Jan 26 10:01:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 10:01:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 10:01:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:35 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b64000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:35 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58001c40 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:36 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000037s ======
Jan 26 10:01:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:36.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000037s
Jan 26 10:01:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:01:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000038s ======
Jan 26 10:01:36 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:36.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Jan 26 10:01:36 compute-1 ceph-mon[80107]: pgmap v573: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Jan 26 10:01:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100137 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 10:01:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:37 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:37 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b64001bd0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:01:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:38 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:01:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000038s ======
Jan 26 10:01:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:38.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Jan 26 10:01:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000038s ======
Jan 26 10:01:38 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:38.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Jan 26 10:01:38 compute-1 ceph-mon[80107]: pgmap v574: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 853 B/s wr, 2 op/s
Jan 26 10:01:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:39 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:39 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:39 compute-1 ceph-mon[80107]: pgmap v575: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 853 B/s wr, 2 op/s
Jan 26 10:01:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:40 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b64001bd0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:40.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:40.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:40 compute-1 sudo[226837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:01:40 compute-1 sudo[226837]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:01:40 compute-1 sudo[226837]: pam_unix(sudo:session): session closed for user root
Jan 26 10:01:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:41 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c002910 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:41 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b400016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:42 compute-1 ceph-mon[80107]: pgmap v576: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 10:01:42 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:42 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:42 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:42 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:01:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:42.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:01:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:42 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:42.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:01:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:43 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b640089d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:43 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c002910 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:44 compute-1 ceph-mon[80107]: pgmap v577: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 170 B/s wr, 1 op/s
Jan 26 10:01:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:44 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:44.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:44.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:45 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:01:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:45 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:01:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:45 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:45 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b640089d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:46 compute-1 ceph-mon[80107]: pgmap v578: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 682 B/s wr, 2 op/s
Jan 26 10:01:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:46 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c002910 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:46.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:46.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:47 compute-1 podman[226865]: 2026-01-26 10:01:47.35297059 +0000 UTC m=+0.127510077 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 26 10:01:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:47 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:47 compute-1 nova_compute[226322]: 2026-01-26 10:01:47.644 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:01:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:47 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:47 compute-1 nova_compute[226322]: 2026-01-26 10:01:47.699 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:01:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:01:48 compute-1 ceph-mon[80107]: pgmap v579: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 596 B/s wr, 1 op/s
Jan 26 10:01:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:48 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:48 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 10:01:48 compute-1 sudo[226894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:01:48 compute-1 sudo[226894]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:01:48 compute-1 sudo[226894]: pam_unix(sudo:session): session closed for user root
Jan 26 10:01:48 compute-1 sudo[226919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:01:48 compute-1 sudo[226919]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:01:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000038s ======
Jan 26 10:01:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:48.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Jan 26 10:01:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:48.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:48 compute-1 sudo[226919]: pam_unix(sudo:session): session closed for user root
Jan 26 10:01:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:01:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:01:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:01:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:01:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:01:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:01:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:01:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:01:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:49 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:49 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:50 compute-1 ceph-mon[80107]: pgmap v580: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 596 B/s wr, 1 op/s
Jan 26 10:01:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:50 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:50.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:50.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:51 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:51 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b640096e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:52 compute-1 ceph-mon[80107]: pgmap v581: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 10:01:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:52 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003a10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 10:01:52 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1793256565' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:01:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 10:01:52 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1793256565' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:01:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:52.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:52.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:01:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 10:01:52 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3723518383' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:01:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 10:01:52 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3723518383' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:01:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/1793256565' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:01:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/1793256565' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:01:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/289274094' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:01:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/289274094' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:01:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/3723518383' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:01:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/3723518383' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:01:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:53 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003a10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:53 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b400032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:01:53.923 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:01:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:01:53.924 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:01:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:01:53.924 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:01:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:54 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b640096e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:54 compute-1 ceph-mon[80107]: pgmap v582: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Jan 26 10:01:54 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:01:54 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:01:54 compute-1 sudo[226977]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:01:54 compute-1 sudo[226977]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:01:54 compute-1 sudo[226977]: pam_unix(sudo:session): session closed for user root
Jan 26 10:01:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:54.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:54.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100154 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 10:01:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:55 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003a10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:55 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003a10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:56 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b400032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:56 compute-1 ceph-mon[80107]: pgmap v583: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 26 10:01:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:56.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:56.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:57 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b6400a3f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:57 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003a10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:01:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:58 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003a10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:01:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:01:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:01:58 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:58.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:58.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:01:58 compute-1 ceph-mon[80107]: pgmap v584: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 10:01:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:59 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:01:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:59 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b6400a3f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:00 compute-1 ceph-mon[80107]: pgmap v585: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 10:02:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:00 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003a10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:02:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:00.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:00 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:00.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:00 compute-1 sudo[227005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:02:00 compute-1 sudo[227005]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:02:00 compute-1 sudo[227005]: pam_unix(sudo:session): session closed for user root
Jan 26 10:02:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:01 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:01 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:02 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b6400a3f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:02 compute-1 ceph-mon[80107]: pgmap v586: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 10:02:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:02:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:02.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:02 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:02.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:02:03 compute-1 podman[227031]: 2026-01-26 10:02:03.280570353 +0000 UTC m=+0.055384207 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 26 10:02:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:03 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003a10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:03 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:04 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:04 compute-1 ceph-mon[80107]: pgmap v587: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 26 10:02:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:02:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:02:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:04 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:04.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:04.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:05 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b6400a3f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:05 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003a10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:06 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000038s ======
Jan 26 10:02:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:06.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Jan 26 10:02:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:02:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:06 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:06.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:06 compute-1 ceph-mon[80107]: pgmap v588: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 26 10:02:07 compute-1 sshd-session[227051]: Invalid user ubuntu from 104.248.198.71 port 44538
Jan 26 10:02:07 compute-1 sshd-session[227051]: Connection closed by invalid user ubuntu 104.248.198.71 port 44538 [preauth]
Jan 26 10:02:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:07 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:07 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b6400a3f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:02:07 compute-1 ceph-mon[80107]: pgmap v589: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:02:08 compute-1 sshd-session[227053]: Connection closed by authenticating user root 178.62.249.31 port 36702 [preauth]
Jan 26 10:02:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:08 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b30000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:02:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:08.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:08 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:08.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:09 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:09 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:10 compute-1 ceph-mon[80107]: pgmap v590: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:02:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:10 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:10.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:02:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:10 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:10.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:11 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b300016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:11 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:12 compute-1 ceph-mon[80107]: pgmap v591: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 10:02:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:12 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:12.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:12.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:02:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:13 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:13 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b300016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:14 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:14 compute-1 ceph-mon[80107]: pgmap v592: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:02:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:02:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000037s ======
Jan 26 10:02:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000037s ======
Jan 26 10:02:14 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:14.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000037s
Jan 26 10:02:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:14.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000037s
Jan 26 10:02:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:15 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:15 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b6400a3f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:16 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b300016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:02:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000038s ======
Jan 26 10:02:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:16.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Jan 26 10:02:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000038s ======
Jan 26 10:02:16 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:16.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Jan 26 10:02:16 compute-1 ceph-mon[80107]: pgmap v593: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 10:02:17 compute-1 sshd-session[227062]: Connection closed by authenticating user root 152.42.136.110 port 53216 [preauth]
Jan 26 10:02:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:17 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:17 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:02:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:18 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b6400a3f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:18 compute-1 podman[227065]: 2026-01-26 10:02:18.323486157 +0000 UTC m=+0.101123669 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Jan 26 10:02:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:02:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:18.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:18 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:18.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:18 compute-1 ceph-mon[80107]: pgmap v594: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:02:19 compute-1 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 26 10:02:19 compute-1 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Jan 26 10:02:19 compute-1 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Jan 26 10:02:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:19 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b30002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:02:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:19 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b6400a3f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:20 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:20.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:02:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000037s ======
Jan 26 10:02:20 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:20.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000037s
Jan 26 10:02:20 compute-1 ceph-mon[80107]: pgmap v595: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:02:20 compute-1 sudo[227095]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:02:20 compute-1 sudo[227095]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:02:20 compute-1 sudo[227095]: pam_unix(sudo:session): session closed for user root
Jan 26 10:02:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:21 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:21 compute-1 nova_compute[226322]: 2026-01-26 10:02:21.688 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:02:21 compute-1 nova_compute[226322]: 2026-01-26 10:02:21.689 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:02:21 compute-1 nova_compute[226322]: 2026-01-26 10:02:21.689 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:02:21 compute-1 nova_compute[226322]: 2026-01-26 10:02:21.689 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:02:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:21 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b30002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:21 compute-1 ceph-mon[80107]: pgmap v596: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 0 B/s wr, 40 op/s
Jan 26 10:02:21 compute-1 nova_compute[226322]: 2026-01-26 10:02:21.965 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:02:21 compute-1 nova_compute[226322]: 2026-01-26 10:02:21.965 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:02:21 compute-1 nova_compute[226322]: 2026-01-26 10:02:21.965 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:02:21 compute-1 nova_compute[226322]: 2026-01-26 10:02:21.966 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:02:21 compute-1 nova_compute[226322]: 2026-01-26 10:02:21.966 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:02:21 compute-1 nova_compute[226322]: 2026-01-26 10:02:21.966 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:02:21 compute-1 nova_compute[226322]: 2026-01-26 10:02:21.966 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:02:21 compute-1 nova_compute[226322]: 2026-01-26 10:02:21.966 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:02:21 compute-1 nova_compute[226322]: 2026-01-26 10:02:21.967 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:02:21 compute-1 nova_compute[226322]: 2026-01-26 10:02:21.996 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:02:21 compute-1 nova_compute[226322]: 2026-01-26 10:02:21.997 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:02:21 compute-1 nova_compute[226322]: 2026-01-26 10:02:21.997 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:02:21 compute-1 nova_compute[226322]: 2026-01-26 10:02:21.997 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:02:21 compute-1 nova_compute[226322]: 2026-01-26 10:02:21.997 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:02:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:22 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b6400a3f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:02:22 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2408861058' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:02:22 compute-1 nova_compute[226322]: 2026-01-26 10:02:22.415 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:02:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:02:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:22.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000038s ======
Jan 26 10:02:22 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:22.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Jan 26 10:02:22 compute-1 nova_compute[226322]: 2026-01-26 10:02:22.628 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:02:22 compute-1 nova_compute[226322]: 2026-01-26 10:02:22.629 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5252MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:02:22 compute-1 nova_compute[226322]: 2026-01-26 10:02:22.629 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:02:22 compute-1 nova_compute[226322]: 2026-01-26 10:02:22.629 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:02:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:02:22 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2408861058' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:02:22 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/555301251' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:02:22 compute-1 nova_compute[226322]: 2026-01-26 10:02:22.995 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:02:22 compute-1 nova_compute[226322]: 2026-01-26 10:02:22.996 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:02:23 compute-1 nova_compute[226322]: 2026-01-26 10:02:23.029 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:02:23 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:02:23 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4273966476' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:02:23 compute-1 nova_compute[226322]: 2026-01-26 10:02:23.468 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:02:23 compute-1 nova_compute[226322]: 2026-01-26 10:02:23.478 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:02:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:23 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:23 compute-1 nova_compute[226322]: 2026-01-26 10:02:23.511 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:02:23 compute-1 nova_compute[226322]: 2026-01-26 10:02:23.512 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:02:23 compute-1 nova_compute[226322]: 2026-01-26 10:02:23.512 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.883s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:02:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:23 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:24 compute-1 ceph-mon[80107]: pgmap v597: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 0 B/s wr, 40 op/s
Jan 26 10:02:24 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3685014642' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:02:24 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/4273966476' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:02:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:24 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b30002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:02:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:24.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000037s ======
Jan 26 10:02:24 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:24.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000037s
Jan 26 10:02:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:25 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b6400a3f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:25 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:26 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:26.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:26.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:26 compute-1 ceph-mon[80107]: pgmap v598: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 44 KiB/s rd, 0 B/s wr, 72 op/s
Jan 26 10:02:26 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1980237738' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:02:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:27 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:27 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b30003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:02:27 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3992081513' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:02:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:28 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b6400a3f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:28.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:02:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:28.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:02:28 compute-1 ceph-mon[80107]: pgmap v599: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 44 KiB/s rd, 0 B/s wr, 72 op/s
Jan 26 10:02:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:29 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:29 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:29 compute-1 ceph-mon[80107]: pgmap v600: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 44 KiB/s rd, 0 B/s wr, 72 op/s
Jan 26 10:02:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:30 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b30003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:30.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:02:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:30.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:02:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:31 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b6400a3f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:31 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:32 compute-1 ceph-mon[80107]: pgmap v601: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 44 KiB/s rd, 0 B/s wr, 72 op/s
Jan 26 10:02:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:32 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:32.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:32.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:02:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:33 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b30003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:33 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b6400a3f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:34 compute-1 ceph-mon[80107]: pgmap v602: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 0 B/s wr, 32 op/s
Jan 26 10:02:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:02:34 compute-1 podman[227171]: 2026-01-26 10:02:34.266365767 +0000 UTC m=+0.050615049 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 26 10:02:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:02:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:34.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:02:34 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:34.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:02:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:35 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:35 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b30003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:36 compute-1 ceph-mon[80107]: pgmap v603: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 0 B/s wr, 32 op/s
Jan 26 10:02:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:36 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b6400a3f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:02:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:36.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:36 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:36.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:37 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:37 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:02:38 compute-1 ceph-mon[80107]: pgmap v604: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:02:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:38 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c002100 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:02:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:38.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:02:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:02:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:38 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:38.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:39 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b6400a3f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:39 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:40 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b38000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:40 compute-1 ceph-mon[80107]: pgmap v605: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:02:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:02:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:40 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:40.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:02:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:40.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:02:41 compute-1 sudo[227195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:02:41 compute-1 sudo[227195]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:02:41 compute-1 sudo[227195]: pam_unix(sudo:session): session closed for user root
Jan 26 10:02:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:41 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c002100 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:41 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b6400a3f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:42 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:42 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:02:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:42.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:42 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:42.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:42 compute-1 ceph-mon[80107]: pgmap v606: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 10:02:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:02:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:43 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:43 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b380016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:44 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:44.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:44.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:44 compute-1 ceph-mon[80107]: pgmap v607: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:02:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:45 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:45 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:46 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b380016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:02:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:46.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:02:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:02:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:02:46 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:46.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:02:46 compute-1 ceph-mon[80107]: pgmap v608: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 10:02:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:47 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c002c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:47 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:02:48 compute-1 rsyslogd[1005]: imjournal: 4831 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 26 10:02:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:48 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:48.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:02:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:48.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:02:48 compute-1 ceph-mon[80107]: pgmap v609: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:02:49 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Jan 26 10:02:49 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3978845484' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Jan 26 10:02:49 compute-1 podman[227225]: 2026-01-26 10:02:49.301493202 +0000 UTC m=+0.081643194 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:02:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:49 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b380016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:02:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/865647127' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Jan 26 10:02:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/3978845484' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Jan 26 10:02:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:49 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c002c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:50 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:50.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:02:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:50.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:02:50 compute-1 ceph-mon[80107]: pgmap v610: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:02:50 compute-1 ceph-mon[80107]: from='client.24583 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Jan 26 10:02:50 compute-1 ceph-mon[80107]: from='client.24614 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Jan 26 10:02:50 compute-1 ceph-mon[80107]: from='client.24583 -' entity='client.openstack' cmd=[{"prefix": "nfs cluster info", "cluster_id": "cephfs", "format": "json"}]: dispatch
Jan 26 10:02:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:51 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:51 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b38002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:52 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:02:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:52.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:02:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:52.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:52 compute-1 ceph-mon[80107]: pgmap v611: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 10:02:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:02:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:53 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:53 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:02:53.925 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:02:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:02:53.925 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:02:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:02:53.925 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:02:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:54 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b38002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:54.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:02:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:54.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:02:54 compute-1 sudo[227256]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:02:54 compute-1 sudo[227256]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:02:54 compute-1 sudo[227256]: pam_unix(sudo:session): session closed for user root
Jan 26 10:02:54 compute-1 ceph-mon[80107]: pgmap v612: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:02:54 compute-1 sudo[227281]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:02:54 compute-1 sudo[227281]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:02:55 compute-1 sudo[227281]: pam_unix(sudo:session): session closed for user root
Jan 26 10:02:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:55 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:55 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:55 compute-1 sshd-session[227254]: Connection closed by authenticating user root 178.62.249.31 port 45818 [preauth]
Jan 26 10:02:56 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:02:56 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:02:56 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:02:56 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:02:56 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:02:56 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:02:56 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:02:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:56 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:56 compute-1 sshd-session[227337]: Invalid user ubuntu from 104.248.198.71 port 37434
Jan 26 10:02:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:56.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:56.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:56 compute-1 sshd-session[227337]: Connection closed by invalid user ubuntu 104.248.198.71 port 37434 [preauth]
Jan 26 10:02:57 compute-1 ceph-mon[80107]: pgmap v613: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 10:02:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:57 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:57 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:02:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:58 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:58 compute-1 ceph-mon[80107]: pgmap v614: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:02:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:02:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:58.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:02:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:02:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:02:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:58.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:02:58 compute-1 sshd-session[227340]: Connection closed by authenticating user root 152.42.136.110 port 56012 [preauth]
Jan 26 10:02:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/914721217' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:02:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/914721217' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:02:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:59 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b38003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:02:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:59 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:00 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:00 compute-1 sudo[227343]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:03:00 compute-1 sudo[227343]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:03:00 compute-1 sudo[227343]: pam_unix(sudo:session): session closed for user root
Jan 26 10:03:00 compute-1 ceph-mon[80107]: pgmap v615: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:03:00 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:03:00 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:03:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:03:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:00.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:03:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:00.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:01 compute-1 sudo[227368]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:03:01 compute-1 sudo[227368]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:03:01 compute-1 sudo[227368]: pam_unix(sudo:session): session closed for user root
Jan 26 10:03:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:01 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:01 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b38003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:02 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:02 compute-1 ceph-mon[80107]: pgmap v616: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 36 KiB/s rd, 0 B/s wr, 59 op/s
Jan 26 10:03:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:02.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:03:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:02.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:03:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:03:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:03 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:03 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:04 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:04.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:03:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:04.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:03:04 compute-1 ceph-mon[80107]: pgmap v617: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 36 KiB/s rd, 0 B/s wr, 59 op/s
Jan 26 10:03:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:03:05 compute-1 podman[227395]: 2026-01-26 10:03:05.268426207 +0000 UTC m=+0.044943605 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 26 10:03:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:05 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:05 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b38003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:06 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:06.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:06.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:06 compute-1 ceph-mon[80107]: pgmap v618: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 36 KiB/s rd, 0 B/s wr, 59 op/s
Jan 26 10:03:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:07 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:07 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:03:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:08 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b38003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:08.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:08.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:08 compute-1 ceph-mon[80107]: pgmap v619: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 36 KiB/s rd, 0 B/s wr, 59 op/s
Jan 26 10:03:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:09 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:09 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:10 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:03:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:10.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:10 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:10.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:10 compute-1 ceph-mon[80107]: pgmap v620: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 36 KiB/s rd, 0 B/s wr, 59 op/s
Jan 26 10:03:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:11 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:11 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:11 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/611339985' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Jan 26 10:03:11 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/3825936780' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Jan 26 10:03:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:12 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:12.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:12.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:12 compute-1 ceph-mon[80107]: pgmap v621: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 36 KiB/s rd, 0 B/s wr, 59 op/s
Jan 26 10:03:12 compute-1 ceph-mon[80107]: from='client.24598 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Jan 26 10:03:12 compute-1 ceph-mon[80107]: from='client.24598 -' entity='client.openstack' cmd=[{"prefix": "nfs cluster info", "cluster_id": "cephfs", "format": "json"}]: dispatch
Jan 26 10:03:12 compute-1 ceph-mon[80107]: from='client.15099 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Jan 26 10:03:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:03:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:13 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:13 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:14 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b30002690 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:03:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:14.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:03:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:03:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:14 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:14.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:14 compute-1 ceph-mon[80107]: pgmap v622: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:03:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:15 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:15 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:16 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:16.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:16.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:16 compute-1 ceph-mon[80107]: pgmap v623: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 10:03:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:17 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:17 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b30002690 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:03:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:18 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:03:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:18.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:03:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:03:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:18 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:18.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:18 compute-1 ceph-mon[80107]: pgmap v624: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:03:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:03:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:19 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:19 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c004a60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:19 compute-1 ceph-mon[80107]: pgmap v625: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:03:20 compute-1 podman[227424]: 2026-01-26 10:03:20.318759564 +0000 UTC m=+0.093348527 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 10:03:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:20 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c004a60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:03:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:03:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:20.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:03:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:20 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:20.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:21 compute-1 sudo[227452]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:03:21 compute-1 sudo[227452]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:03:21 compute-1 sudo[227452]: pam_unix(sudo:session): session closed for user root
Jan 26 10:03:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:21 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:21 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:22 compute-1 ceph-mon[80107]: pgmap v626: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 10:03:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:22 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:03:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:03:22 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:22.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:22.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:03:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:03:23 compute-1 nova_compute[226322]: 2026-01-26 10:03:23.505 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:03:23 compute-1 nova_compute[226322]: 2026-01-26 10:03:23.527 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:03:23 compute-1 nova_compute[226322]: 2026-01-26 10:03:23.528 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:03:23 compute-1 nova_compute[226322]: 2026-01-26 10:03:23.528 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:03:23 compute-1 nova_compute[226322]: 2026-01-26 10:03:23.528 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:03:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:23 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b30002690 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:23 compute-1 nova_compute[226322]: 2026-01-26 10:03:23.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:03:23 compute-1 nova_compute[226322]: 2026-01-26 10:03:23.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:03:23 compute-1 nova_compute[226322]: 2026-01-26 10:03:23.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:03:23 compute-1 nova_compute[226322]: 2026-01-26 10:03:23.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:03:23 compute-1 nova_compute[226322]: 2026-01-26 10:03:23.706 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:03:23 compute-1 nova_compute[226322]: 2026-01-26 10:03:23.707 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:03:23 compute-1 nova_compute[226322]: 2026-01-26 10:03:23.707 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:03:23 compute-1 nova_compute[226322]: 2026-01-26 10:03:23.707 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:03:23 compute-1 nova_compute[226322]: 2026-01-26 10:03:23.708 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:03:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:23 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:23 compute-1 nova_compute[226322]: 2026-01-26 10:03:23.983 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:03:23 compute-1 nova_compute[226322]: 2026-01-26 10:03:23.983 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:03:23 compute-1 nova_compute[226322]: 2026-01-26 10:03:23.983 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:03:23 compute-1 nova_compute[226322]: 2026-01-26 10:03:23.984 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:03:23 compute-1 nova_compute[226322]: 2026-01-26 10:03:23.984 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:03:24 compute-1 ceph-mon[80107]: pgmap v627: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:03:24 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3628511201' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:03:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:24 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c004a60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:24 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:03:24 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1818591514' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:03:24 compute-1 nova_compute[226322]: 2026-01-26 10:03:24.455 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:03:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:03:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:03:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:24.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:03:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:03:24 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:24.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:03:24 compute-1 nova_compute[226322]: 2026-01-26 10:03:24.607 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:03:24 compute-1 nova_compute[226322]: 2026-01-26 10:03:24.609 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5245MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:03:24 compute-1 nova_compute[226322]: 2026-01-26 10:03:24.609 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:03:24 compute-1 nova_compute[226322]: 2026-01-26 10:03:24.609 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:03:24 compute-1 nova_compute[226322]: 2026-01-26 10:03:24.673 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:03:24 compute-1 nova_compute[226322]: 2026-01-26 10:03:24.673 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:03:24 compute-1 nova_compute[226322]: 2026-01-26 10:03:24.691 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:03:25 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:03:25 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1487621882' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:03:25 compute-1 nova_compute[226322]: 2026-01-26 10:03:25.164 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:03:25 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1818591514' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:03:25 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/642867322' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:03:25 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1487621882' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:03:25 compute-1 nova_compute[226322]: 2026-01-26 10:03:25.169 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:03:25 compute-1 nova_compute[226322]: 2026-01-26 10:03:25.189 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:03:25 compute-1 nova_compute[226322]: 2026-01-26 10:03:25.191 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:03:25 compute-1 nova_compute[226322]: 2026-01-26 10:03:25.191 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:03:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:25 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:25 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:26 compute-1 ceph-mon[80107]: pgmap v628: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 10:03:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:26 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:03:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:26.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:03:26 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:26.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:03:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:27 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c004a60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:27 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b30002e50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:03:28 compute-1 ceph-mon[80107]: pgmap v629: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:03:28 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/497130841' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:03:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:28 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:28.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:28.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:29 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/799836587' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:03:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:29 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:29 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c004a60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:30 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b30002e50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:30 compute-1 ceph-mon[80107]: pgmap v630: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:03:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:30.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:30.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:31 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:31 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:32 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c004a60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:32.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:32.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:32 compute-1 ceph-mon[80107]: pgmap v631: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 10:03:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:03:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:33 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c004a60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:33 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:03:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:03:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:34.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:03:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:34.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:34 compute-1 ceph-mon[80107]: pgmap v632: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:03:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:03:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:35 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b30002e50 fd 47 proxy ignored for local
Jan 26 10:03:35 compute-1 kernel: ganesha.nfsd[227420]: segfault at 50 ip 00007f0be774532e sp 00007f0b6effc210 error 4 in libntirpc.so.5.8[7f0be772a000+2c000] likely on CPU 1 (core 0, socket 1)
Jan 26 10:03:35 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 10:03:35 compute-1 systemd[1]: Started Process Core Dump (PID 227529/UID 0).
Jan 26 10:03:35 compute-1 podman[227530]: 2026-01-26 10:03:35.685440751 +0000 UTC m=+0.090814902 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 10:03:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:36.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:36.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:36 compute-1 ceph-mon[80107]: pgmap v633: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 10:03:36 compute-1 systemd-coredump[227531]: Process 226476 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 61:
                                                    #0  0x00007f0be774532e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 26 10:03:36 compute-1 systemd[1]: systemd-coredump@8-227529-0.service: Deactivated successfully.
Jan 26 10:03:36 compute-1 systemd[1]: systemd-coredump@8-227529-0.service: Consumed 1.262s CPU time.
Jan 26 10:03:36 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 10:03:36 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 10:03:37 compute-1 podman[227555]: 2026-01-26 10:03:37.011449083 +0000 UTC m=+0.026584871 container died 3a6ae4dc2df3af4a6bf4e06b720b3b075b63d61c4adfe2b2c9d563de7c3cae7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:03:37 compute-1 systemd[1]: var-lib-containers-storage-overlay-93dfa11a05619004f732a88ff930df4934d43a692e19d27b8c03adb369fa54e2-merged.mount: Deactivated successfully.
Jan 26 10:03:37 compute-1 podman[227555]: 2026-01-26 10:03:37.056555899 +0000 UTC m=+0.071691707 container remove 3a6ae4dc2df3af4a6bf4e06b720b3b075b63d61c4adfe2b2c9d563de7c3cae7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2)
Jan 26 10:03:37 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 10:03:37 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 10:03:37 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.780s CPU time.
Jan 26 10:03:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:03:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:03:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:38.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:38 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:38.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:38 compute-1 ceph-mon[80107]: pgmap v634: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:03:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:40.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:40.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:40 compute-1 ceph-mon[80107]: pgmap v635: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:03:41 compute-1 sudo[227604]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:03:41 compute-1 sudo[227604]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:03:41 compute-1 sudo[227604]: pam_unix(sudo:session): session closed for user root
Jan 26 10:03:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100341 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 10:03:41 compute-1 sshd-session[227600]: Connection closed by authenticating user root 152.42.136.110 port 55384 [preauth]
Jan 26 10:03:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:03:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:42.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:03:42 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:42.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:03:42 compute-1 sshd-session[227602]: Connection closed by authenticating user root 178.62.249.31 port 55648 [preauth]
Jan 26 10:03:42 compute-1 ceph-mon[80107]: pgmap v636: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 10:03:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:03:44 compute-1 ceph-mon[80107]: pgmap v637: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 10:03:44 compute-1 sshd-session[227631]: Invalid user ubuntu from 104.248.198.71 port 60964
Jan 26 10:03:44 compute-1 sshd-session[227631]: Connection closed by invalid user ubuntu 104.248.198.71 port 60964 [preauth]
Jan 26 10:03:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:44.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:03:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:44.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:03:46 compute-1 ceph-mon[80107]: pgmap v638: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:03:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:03:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:46.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:03:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:03:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:46.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.126357) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421827126427, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2363, "num_deletes": 251, "total_data_size": 6264099, "memory_usage": 6368336, "flush_reason": "Manual Compaction"}
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421827150421, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 4098766, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20744, "largest_seqno": 23102, "table_properties": {"data_size": 4089234, "index_size": 6026, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19523, "raw_average_key_size": 20, "raw_value_size": 4070294, "raw_average_value_size": 4213, "num_data_blocks": 264, "num_entries": 966, "num_filter_entries": 966, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769421607, "oldest_key_time": 1769421607, "file_creation_time": 1769421827, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 24252 microseconds, and 9785 cpu microseconds.
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.150604) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 4098766 bytes OK
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.150651) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.152793) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.152830) EVENT_LOG_v1 {"time_micros": 1769421827152819, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.152890) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 6253727, prev total WAL file size 6253727, number of live WAL files 2.
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.156145) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(4002KB)], [39(12MB)]
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421827156267, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 17167483, "oldest_snapshot_seqno": -1}
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 5417 keys, 14966389 bytes, temperature: kUnknown
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421827244244, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 14966389, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14927647, "index_size": 24104, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13573, "raw_key_size": 136564, "raw_average_key_size": 25, "raw_value_size": 14827180, "raw_average_value_size": 2737, "num_data_blocks": 997, "num_entries": 5417, "num_filter_entries": 5417, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769421827, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.244512) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 14966389 bytes
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.245674) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 194.9 rd, 169.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 12.5 +0.0 blob) out(14.3 +0.0 blob), read-write-amplify(7.8) write-amplify(3.7) OK, records in: 5937, records dropped: 520 output_compression: NoCompression
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.245811) EVENT_LOG_v1 {"time_micros": 1769421827245683, "job": 22, "event": "compaction_finished", "compaction_time_micros": 88067, "compaction_time_cpu_micros": 31373, "output_level": 6, "num_output_files": 1, "total_output_size": 14966389, "num_input_records": 5937, "num_output_records": 5417, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421827246798, "job": 22, "event": "table_file_deletion", "file_number": 41}
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421827249465, "job": 22, "event": "table_file_deletion", "file_number": 39}
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.156013) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.249508) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.249513) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.249515) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.249517) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:03:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.249519) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:03:47 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 9.
Jan 26 10:03:47 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 10:03:47 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.780s CPU time.
Jan 26 10:03:47 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 10:03:47 compute-1 podman[227683]: 2026-01-26 10:03:47.678473191 +0000 UTC m=+0.043417490 container create da36348c21b1c64499d70f03079a8c10f8292749eea8acac8020d0941757a01f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Jan 26 10:03:47 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/388b2e91ee2994701e041fb7b4f25f4a9af1107644f6559cc4d916cf9618d53c/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 10:03:47 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/388b2e91ee2994701e041fb7b4f25f4a9af1107644f6559cc4d916cf9618d53c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 10:03:47 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/388b2e91ee2994701e041fb7b4f25f4a9af1107644f6559cc4d916cf9618d53c/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 10:03:47 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/388b2e91ee2994701e041fb7b4f25f4a9af1107644f6559cc4d916cf9618d53c/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 10:03:47 compute-1 podman[227683]: 2026-01-26 10:03:47.742671333 +0000 UTC m=+0.107615632 container init da36348c21b1c64499d70f03079a8c10f8292749eea8acac8020d0941757a01f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:03:47 compute-1 podman[227683]: 2026-01-26 10:03:47.749099781 +0000 UTC m=+0.114044050 container start da36348c21b1c64499d70f03079a8c10f8292749eea8acac8020d0941757a01f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Jan 26 10:03:47 compute-1 bash[227683]: da36348c21b1c64499d70f03079a8c10f8292749eea8acac8020d0941757a01f
Jan 26 10:03:47 compute-1 podman[227683]: 2026-01-26 10:03:47.657719489 +0000 UTC m=+0.022663778 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 10:03:47 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 10:03:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 10:03:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 10:03:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 10:03:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 10:03:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 10:03:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 10:03:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 10:03:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:03:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:03:48 compute-1 ceph-mon[80107]: pgmap v639: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 10:03:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:03:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:48.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:03:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:48.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:03:50 compute-1 ceph-mon[80107]: pgmap v640: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 10:03:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:03:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:50.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:03:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:03:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:50.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:03:51 compute-1 podman[227741]: 2026-01-26 10:03:51.325466736 +0000 UTC m=+0.105832551 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 26 10:03:52 compute-1 ceph-mon[80107]: pgmap v641: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 26 10:03:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:52.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 26 10:03:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:52.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 26 10:03:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:03:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:53 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:03:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:53 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:03:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:53 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:03:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:03:53.926 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:03:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:03:53.927 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:03:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:03:53.927 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:03:54 compute-1 ceph-mon[80107]: pgmap v642: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 26 10:03:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:03:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:54.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:03:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:54.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100354 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 10:03:56 compute-1 ceph-mon[80107]: pgmap v643: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 10:03:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:03:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:56.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:03:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:56.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:03:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:57 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:03:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:57 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:03:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:57 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:03:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:58 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:03:58 compute-1 ceph-mon[80107]: pgmap v644: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 10:03:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/1736252094' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:03:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/1736252094' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:03:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:58.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:03:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:03:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:03:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:58.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:00 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:04:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:00 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:04:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:00 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:04:00 compute-1 ceph-mon[80107]: pgmap v645: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Jan 26 10:04:00 compute-1 sudo[227773]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:04:00 compute-1 sudo[227773]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:04:00 compute-1 sudo[227773]: pam_unix(sudo:session): session closed for user root
Jan 26 10:04:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:00.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:00.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:00 compute-1 sudo[227798]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:04:00 compute-1 sudo[227798]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:04:01 compute-1 sudo[227798]: pam_unix(sudo:session): session closed for user root
Jan 26 10:04:01 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:04:01 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:04:01 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:04:01 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:04:01 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:04:01 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:04:01 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:04:01 compute-1 sudo[227854]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:04:01 compute-1 sudo[227854]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:04:01 compute-1 sudo[227854]: pam_unix(sudo:session): session closed for user root
Jan 26 10:04:02 compute-1 ceph-mon[80107]: pgmap v646: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 426 B/s wr, 2 op/s
Jan 26 10:04:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 26 10:04:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:02.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 26 10:04:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:02.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:04:03 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:04:03.259 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:04:03 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:04:03.262 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 10:04:03 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:04:03.263 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:04:04 compute-1 ceph-mon[80107]: pgmap v647: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 341 B/s wr, 1 op/s
Jan 26 10:04:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:04:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:04.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:04.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:06 compute-1 podman[227883]: 2026-01-26 10:04:06.317530882 +0000 UTC m=+0.083316611 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f034c000df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 10:04:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 10:04:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:06.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:06.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:07 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:07 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:04:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:08 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:08.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:08.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100409 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 10:04:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:09 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:09 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:10 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03280016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:10.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:10.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:11 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100411 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 10:04:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:11 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:11 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:04:11 compute-1 ceph-mon[80107]: pgmap v648: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 10:04:12 compute-1 sudo[227920]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:04:12 compute-1 sudo[227920]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:04:12 compute-1 sudo[227920]: pam_unix(sudo:session): session closed for user root
Jan 26 10:04:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:12 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:12.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:12.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:04:12 compute-1 ceph-mon[80107]: pgmap v649: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 597 B/s wr, 2 op/s
Jan 26 10:04:12 compute-1 ceph-mon[80107]: pgmap v650: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 597 B/s wr, 2 op/s
Jan 26 10:04:12 compute-1 ceph-mon[80107]: pgmap v651: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 597 B/s wr, 2 op/s
Jan 26 10:04:12 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:04:12 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:04:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:13 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:13 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:14 compute-1 ceph-mon[80107]: pgmap v652: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 597 B/s wr, 2 op/s
Jan 26 10:04:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:14 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 26 10:04:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:14.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 26 10:04:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:14.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:14 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:04:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:14 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:04:15 compute-1 ceph-mon[80107]: Health check failed: 1 OSD(s) experiencing slow operations in BlueStore (BLUESTORE_SLOW_OP_ALERT)
Jan 26 10:04:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:15 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:15 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:16 compute-1 ceph-mon[80107]: pgmap v653: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 767 B/s wr, 3 op/s
Jan 26 10:04:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:16 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:04:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:16.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:04:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:16.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:17 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:04:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:17 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:04:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:17 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f033c001340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:17 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f033c001340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:04:18 compute-1 ceph-mon[80107]: pgmap v654: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 170 B/s wr, 1 op/s
Jan 26 10:04:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:18 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:04:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:18.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:04:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:18.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:19 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:04:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:04:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:19 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:20 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f033c0022e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:20 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 10:04:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:20 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:04:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:20.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:20.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:20 compute-1 ceph-mon[80107]: pgmap v655: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 170 B/s wr, 1 op/s
Jan 26 10:04:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:21 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f033c0022e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:21 compute-1 sudo[227950]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:04:21 compute-1 sudo[227950]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:04:21 compute-1 sudo[227950]: pam_unix(sudo:session): session closed for user root
Jan 26 10:04:21 compute-1 podman[227974]: 2026-01-26 10:04:21.68600953 +0000 UTC m=+0.071419848 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 26 10:04:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:21 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328002160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:22 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:04:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:22.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:04:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:22.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:22 compute-1 ceph-mon[80107]: pgmap v656: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 511 B/s wr, 2 op/s
Jan 26 10:04:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:04:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:23 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:04:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:23 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:04:23 compute-1 sshd-session[228002]: Connection closed by authenticating user root 152.42.136.110 port 44982 [preauth]
Jan 26 10:04:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:23 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:23 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:04:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:23 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:04:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:23 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:24 compute-1 nova_compute[226322]: 2026-01-26 10:04:24.170 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:04:24 compute-1 nova_compute[226322]: 2026-01-26 10:04:24.171 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:04:24 compute-1 nova_compute[226322]: 2026-01-26 10:04:24.171 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:04:24 compute-1 nova_compute[226322]: 2026-01-26 10:04:24.171 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:04:24 compute-1 nova_compute[226322]: 2026-01-26 10:04:24.192 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:04:24 compute-1 nova_compute[226322]: 2026-01-26 10:04:24.192 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:04:24 compute-1 nova_compute[226322]: 2026-01-26 10:04:24.193 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:04:24 compute-1 nova_compute[226322]: 2026-01-26 10:04:24.193 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:04:24 compute-1 nova_compute[226322]: 2026-01-26 10:04:24.193 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:04:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:24 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:24 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:04:24 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/613830193' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:04:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:24.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:24.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:24 compute-1 nova_compute[226322]: 2026-01-26 10:04:24.647 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:04:24 compute-1 ceph-mon[80107]: pgmap v657: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 511 B/s wr, 2 op/s
Jan 26 10:04:24 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3030815994' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:04:24 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/613830193' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:04:24 compute-1 nova_compute[226322]: 2026-01-26 10:04:24.803 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:04:24 compute-1 nova_compute[226322]: 2026-01-26 10:04:24.806 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5235MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:04:24 compute-1 nova_compute[226322]: 2026-01-26 10:04:24.806 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:04:24 compute-1 nova_compute[226322]: 2026-01-26 10:04:24.807 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:04:24 compute-1 nova_compute[226322]: 2026-01-26 10:04:24.870 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:04:24 compute-1 nova_compute[226322]: 2026-01-26 10:04:24.870 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:04:24 compute-1 nova_compute[226322]: 2026-01-26 10:04:24.884 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:04:25 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:04:25 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/456321413' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:04:25 compute-1 nova_compute[226322]: 2026-01-26 10:04:25.348 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:04:25 compute-1 nova_compute[226322]: 2026-01-26 10:04:25.354 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:04:25 compute-1 nova_compute[226322]: 2026-01-26 10:04:25.376 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:04:25 compute-1 nova_compute[226322]: 2026-01-26 10:04:25.378 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:04:25 compute-1 nova_compute[226322]: 2026-01-26 10:04:25.379 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.572s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:04:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:25 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f033c0022e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:25 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340003ca0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:25 compute-1 nova_compute[226322]: 2026-01-26 10:04:25.889 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:04:25 compute-1 nova_compute[226322]: 2026-01-26 10:04:25.889 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:04:25 compute-1 nova_compute[226322]: 2026-01-26 10:04:25.890 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:04:25 compute-1 nova_compute[226322]: 2026-01-26 10:04:25.890 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:04:25 compute-1 nova_compute[226322]: 2026-01-26 10:04:25.946 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:04:25 compute-1 nova_compute[226322]: 2026-01-26 10:04:25.947 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:04:25 compute-1 nova_compute[226322]: 2026-01-26 10:04:25.948 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:04:25 compute-1 nova_compute[226322]: 2026-01-26 10:04:25.948 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:04:25 compute-1 nova_compute[226322]: 2026-01-26 10:04:25.948 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:04:26 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3671987171' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:04:26 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/456321413' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:04:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:26 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 26 10:04:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:26.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 26 10:04:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:26.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100426 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 10:04:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:26 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 10:04:27 compute-1 ceph-mon[80107]: pgmap v658: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.9 KiB/s rd, 1.3 KiB/s wr, 5 op/s
Jan 26 10:04:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:27 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:27 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f033c0022e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:04:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:28 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:04:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:28.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:04:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:28.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:28 compute-1 ceph-mon[80107]: pgmap v659: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 1.2 KiB/s wr, 4 op/s
Jan 26 10:04:29 compute-1 sshd-session[228051]: Connection closed by authenticating user root 178.62.249.31 port 33868 [preauth]
Jan 26 10:04:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:29 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:29 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1863012155' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:04:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:29 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:30 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f033c0022e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 10:04:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:30.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 10:04:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:30.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:30 compute-1 ceph-mon[80107]: pgmap v660: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 1.2 KiB/s wr, 4 op/s
Jan 26 10:04:30 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1042271784' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:04:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:31 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100431 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 10:04:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:31 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:32 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:32 compute-1 sshd-session[228055]: Invalid user ubuntu from 104.248.198.71 port 53778
Jan 26 10:04:32 compute-1 sshd-session[228055]: Connection closed by invalid user ubuntu 104.248.198.71 port 53778 [preauth]
Jan 26 10:04:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:32.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 26 10:04:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:32.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 26 10:04:32 compute-1 ceph-mon[80107]: pgmap v661: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.8 KiB/s rd, 1.3 KiB/s wr, 5 op/s
Jan 26 10:04:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:04:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:33 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f033c0022e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:04:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:33 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:34 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:34.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:34.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:34 compute-1 ceph-mon[80107]: pgmap v662: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 10:04:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:35 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:35 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f033c0022e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:36 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:36.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:36.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:36 compute-1 ceph-mon[80107]: pgmap v663: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 26 10:04:37 compute-1 podman[228059]: 2026-01-26 10:04:37.281616817 +0000 UTC m=+0.060102630 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 26 10:04:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:37 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:37 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:04:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:38 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350001110 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:38.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:38.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:38 compute-1 ceph-mon[80107]: pgmap v664: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 596 B/s rd, 170 B/s wr, 0 op/s
Jan 26 10:04:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:39 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:39 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:40 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:40.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:40.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:40 compute-1 ceph-mon[80107]: pgmap v665: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 596 B/s rd, 170 B/s wr, 0 op/s
Jan 26 10:04:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:41 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350001110 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:41 compute-1 sudo[228086]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:04:41 compute-1 sudo[228086]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:04:41 compute-1 sudo[228086]: pam_unix(sudo:session): session closed for user root
Jan 26 10:04:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:41 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:42 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:42 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:42.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 26 10:04:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:42.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 26 10:04:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:04:42 compute-1 ceph-mon[80107]: pgmap v666: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 170 B/s wr, 1 op/s
Jan 26 10:04:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:43 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:43 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03500022a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:44 compute-1 ceph-mon[80107]: pgmap v667: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:04:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:44 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:44.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:44.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:45 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:45 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:46 compute-1 ceph-mon[80107]: pgmap v668: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:04:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:46 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03500022a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:46.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:46.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03500022a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:04:48 compute-1 ceph-mon[80107]: pgmap v669: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:04:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:48 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:48.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:48.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:49 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:04:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:49 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03500022a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:50 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:50 compute-1 ceph-mon[80107]: pgmap v670: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:04:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 26 10:04:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:50.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 26 10:04:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:50.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:51 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:51 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:52 compute-1 podman[228116]: 2026-01-26 10:04:52.376772619 +0000 UTC m=+0.153150873 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 26 10:04:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:52 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350003b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:52.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:52.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:52 compute-1 ceph-mon[80107]: pgmap v671: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 497 B/s rd, 0 op/s
Jan 26 10:04:53 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:04:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:53 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350003b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:53 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:04:53.927 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:04:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:04:53.927 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:04:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:04:53.927 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:04:54 compute-1 ceph-mon[80107]: pgmap v672: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 248 B/s rd, 0 op/s
Jan 26 10:04:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:54 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c000e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:54.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:54.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:55 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:55 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350003b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:56 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03200016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:56.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:56.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:57 compute-1 ceph-mon[80107]: pgmap v673: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 248 B/s rd, 0 op/s
Jan 26 10:04:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:57 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c001920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:57 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.083174) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421898083245, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 906, "num_deletes": 250, "total_data_size": 1850403, "memory_usage": 1869800, "flush_reason": "Manual Compaction"}
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421898188522, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 793278, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23107, "largest_seqno": 24008, "table_properties": {"data_size": 789864, "index_size": 1194, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 9210, "raw_average_key_size": 20, "raw_value_size": 782456, "raw_average_value_size": 1719, "num_data_blocks": 52, "num_entries": 455, "num_filter_entries": 455, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769421828, "oldest_key_time": 1769421828, "file_creation_time": 1769421898, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 105434 microseconds, and 4061 cpu microseconds.
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:04:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.188612) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 793278 bytes OK
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.188642) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.266715) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.266778) EVENT_LOG_v1 {"time_micros": 1769421898266766, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.266840) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 1845815, prev total WAL file size 1847663, number of live WAL files 2.
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.267852) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353030' seq:72057594037927935, type:22 .. '6D67727374617400373531' seq:0, type:0; will stop at (end)
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(774KB)], [42(14MB)]
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421898267979, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 15759667, "oldest_snapshot_seqno": -1}
Jan 26 10:04:58 compute-1 ceph-mon[80107]: pgmap v674: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 248 B/s rd, 0 op/s
Jan 26 10:04:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/192998050' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:04:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/192998050' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 5383 keys, 12101845 bytes, temperature: kUnknown
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421898337048, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 12101845, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12067130, "index_size": 20141, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13509, "raw_key_size": 136238, "raw_average_key_size": 25, "raw_value_size": 11971042, "raw_average_value_size": 2223, "num_data_blocks": 823, "num_entries": 5383, "num_filter_entries": 5383, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769421898, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.337292) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 12101845 bytes
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.338565) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 227.9 rd, 175.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 14.3 +0.0 blob) out(11.5 +0.0 blob), read-write-amplify(35.1) write-amplify(15.3) OK, records in: 5872, records dropped: 489 output_compression: NoCompression
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.338586) EVENT_LOG_v1 {"time_micros": 1769421898338577, "job": 24, "event": "compaction_finished", "compaction_time_micros": 69142, "compaction_time_cpu_micros": 29070, "output_level": 6, "num_output_files": 1, "total_output_size": 12101845, "num_input_records": 5872, "num_output_records": 5383, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421898338888, "job": 24, "event": "table_file_deletion", "file_number": 44}
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421898341864, "job": 24, "event": "table_file_deletion", "file_number": 42}
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.267677) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.341987) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.341993) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.341995) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.341997) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:04:58 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.341999) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:04:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:58 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350003b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:58.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:04:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:04:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:58.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:04:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:59 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03200016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:04:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:59 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c001920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:00 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:00 compute-1 ceph-mon[80107]: pgmap v675: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 248 B/s rd, 0 op/s
Jan 26 10:05:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 26 10:05:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:00.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 26 10:05:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:00.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:01 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:01 compute-1 sudo[228151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:05:01 compute-1 sudo[228151]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:05:01 compute-1 sudo[228151]: pam_unix(sudo:session): session closed for user root
Jan 26 10:05:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:01 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03200016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:02 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c001920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:02.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:02.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:02 compute-1 ceph-mon[80107]: pgmap v676: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 497 B/s rd, 0 op/s
Jan 26 10:05:03 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:05:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:03 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004440 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:05:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:03 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:04 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 26 10:05:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:04.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 26 10:05:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000005s ======
Jan 26 10:05:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:04.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000005s
Jan 26 10:05:04 compute-1 ceph-mon[80107]: pgmap v677: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:05:05 compute-1 sshd-session[228176]: Connection closed by authenticating user root 152.42.136.110 port 53620 [preauth]
Jan 26 10:05:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:05 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c002db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:05 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004440 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:06.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:06.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:06 compute-1 ceph-mon[80107]: pgmap v678: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:05:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:07 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:07 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:08 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:05:08 compute-1 podman[228181]: 2026-01-26 10:05:08.269587234 +0000 UTC m=+0.052014781 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 10:05:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:08 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004440 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:08.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:08.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:08 compute-1 ceph-mon[80107]: pgmap v679: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:05:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:09 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:09 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:09 compute-1 ceph-mon[80107]: pgmap v680: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:05:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:10 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c0036d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 26 10:05:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:10.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 26 10:05:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:10.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:11 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004440 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:11 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:11 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:12 compute-1 sudo[228202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:05:12 compute-1 sudo[228202]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:05:12 compute-1 sudo[228202]: pam_unix(sudo:session): session closed for user root
Jan 26 10:05:12 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:12 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:12 compute-1 sudo[228227]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 26 10:05:12 compute-1 sudo[228227]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:05:12 compute-1 ceph-mon[80107]: pgmap v681: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 26 10:05:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:12.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000005s ======
Jan 26 10:05:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:12.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000005s
Jan 26 10:05:13 compute-1 podman[228323]: 2026-01-26 10:05:13.088309085 +0000 UTC m=+0.072197436 container exec 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True)
Jan 26 10:05:13 compute-1 podman[228323]: 2026-01-26 10:05:13.219067429 +0000 UTC m=+0.202955770 container exec_died 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 10:05:13 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:05:13 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 26 10:05:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:13 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c0036d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:13 compute-1 podman[228465]: 2026-01-26 10:05:13.859281937 +0000 UTC m=+0.085819528 container exec 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 10:05:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:13 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c0036d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:13 compute-1 podman[228488]: 2026-01-26 10:05:13.926929953 +0000 UTC m=+0.052805845 container exec_died 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 10:05:14 compute-1 podman[228465]: 2026-01-26 10:05:14.009456601 +0000 UTC m=+0.235994172 container exec_died 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 10:05:14 compute-1 podman[228535]: 2026-01-26 10:05:14.207876212 +0000 UTC m=+0.046913380 container exec da36348c21b1c64499d70f03079a8c10f8292749eea8acac8020d0941757a01f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 10:05:14 compute-1 podman[228535]: 2026-01-26 10:05:14.220175497 +0000 UTC m=+0.059212645 container exec_died da36348c21b1c64499d70f03079a8c10f8292749eea8acac8020d0941757a01f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 10:05:14 compute-1 podman[228601]: 2026-01-26 10:05:14.453462732 +0000 UTC m=+0.099025380 container exec 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 10:05:14 compute-1 podman[228601]: 2026-01-26 10:05:14.467368897 +0000 UTC m=+0.112931545 container exec_died 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 10:05:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:14 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:14 compute-1 ceph-mon[80107]: pgmap v682: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:05:14 compute-1 podman[228666]: 2026-01-26 10:05:14.66441029 +0000 UTC m=+0.054485947 container exec 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, version=2.2.4, distribution-scope=public, name=keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, description=keepalived for Ceph, architecture=x86_64, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Jan 26 10:05:14 compute-1 sshd-session[228339]: Connection closed by authenticating user root 178.62.249.31 port 55544 [preauth]
Jan 26 10:05:14 compute-1 podman[228666]: 2026-01-26 10:05:14.702064321 +0000 UTC m=+0.092139978 container exec_died 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, version=2.2.4, architecture=x86_64, com.redhat.component=keepalived-container, distribution-scope=public, io.openshift.expose-services=, name=keepalived, description=keepalived for Ceph)
Jan 26 10:05:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:14.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:14.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:14 compute-1 sudo[228227]: pam_unix(sudo:session): session closed for user root
Jan 26 10:05:14 compute-1 sudo[228696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:05:14 compute-1 sudo[228696]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:05:14 compute-1 sudo[228696]: pam_unix(sudo:session): session closed for user root
Jan 26 10:05:14 compute-1 sudo[228721]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:05:14 compute-1 sudo[228721]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:05:15 compute-1 sudo[228721]: pam_unix(sudo:session): session closed for user root
Jan 26 10:05:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:15 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:15 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:05:15 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:05:15 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:05:15 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:05:15 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 26 10:05:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:15 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c0036d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:16 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c0036d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:16.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:16.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:16 compute-1 ceph-mon[80107]: pgmap v683: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:05:16 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 26 10:05:16 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:05:16 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:05:16 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:05:16 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:05:16 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:05:16 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:05:16 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:05:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:17 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:17 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:17 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:17 compute-1 ceph-mon[80107]: pgmap v684: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:05:18 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:05:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:18 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c0036d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:18.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 26 10:05:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:18.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 26 10:05:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:05:19 compute-1 sshd-session[228780]: Invalid user ubuntu from 104.248.198.71 port 36350
Jan 26 10:05:19 compute-1 sshd-session[228780]: Connection closed by invalid user ubuntu 104.248.198.71 port 36350 [preauth]
Jan 26 10:05:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:19 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c0036d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:19 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:19 compute-1 ceph-mon[80107]: pgmap v685: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:05:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:20 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000005s ======
Jan 26 10:05:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:20.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000005s
Jan 26 10:05:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:20.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:20 compute-1 sudo[228783]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:05:20 compute-1 sudo[228783]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:05:20 compute-1 sudo[228783]: pam_unix(sudo:session): session closed for user root
Jan 26 10:05:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:21 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:21 compute-1 nova_compute[226322]: 2026-01-26 10:05:21.741 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:05:21 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:05:21 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:05:21 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:21 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:21 compute-1 sudo[228809]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:05:21 compute-1 sudo[228809]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:05:21 compute-1 sudo[228809]: pam_unix(sudo:session): session closed for user root
Jan 26 10:05:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:22 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 26 10:05:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:22.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 26 10:05:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 26 10:05:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:22.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 26 10:05:22 compute-1 ceph-mon[80107]: pgmap v686: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 26 10:05:23 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:05:23 compute-1 podman[228834]: 2026-01-26 10:05:23.348687917 +0000 UTC m=+0.127335415 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:05:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:23 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:23 compute-1 nova_compute[226322]: 2026-01-26 10:05:23.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:05:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:23 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:23 compute-1 ceph-mon[80107]: pgmap v687: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:05:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:24 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:24 compute-1 nova_compute[226322]: 2026-01-26 10:05:24.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:05:24 compute-1 nova_compute[226322]: 2026-01-26 10:05:24.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:05:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:24.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:24.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:25 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1712979899' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:05:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:25 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:25 compute-1 nova_compute[226322]: 2026-01-26 10:05:25.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:05:25 compute-1 nova_compute[226322]: 2026-01-26 10:05:25.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:05:25 compute-1 nova_compute[226322]: 2026-01-26 10:05:25.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:05:25 compute-1 nova_compute[226322]: 2026-01-26 10:05:25.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:05:25 compute-1 nova_compute[226322]: 2026-01-26 10:05:25.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:05:25 compute-1 nova_compute[226322]: 2026-01-26 10:05:25.714 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:05:25 compute-1 nova_compute[226322]: 2026-01-26 10:05:25.714 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:05:25 compute-1 nova_compute[226322]: 2026-01-26 10:05:25.714 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:05:25 compute-1 nova_compute[226322]: 2026-01-26 10:05:25.714 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:05:25 compute-1 nova_compute[226322]: 2026-01-26 10:05:25.715 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:05:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:25 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:26 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:05:26 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1779943853' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:05:26 compute-1 nova_compute[226322]: 2026-01-26 10:05:26.161 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:05:26 compute-1 nova_compute[226322]: 2026-01-26 10:05:26.346 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:05:26 compute-1 nova_compute[226322]: 2026-01-26 10:05:26.348 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5200MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:05:26 compute-1 nova_compute[226322]: 2026-01-26 10:05:26.348 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:05:26 compute-1 nova_compute[226322]: 2026-01-26 10:05:26.349 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:05:26 compute-1 nova_compute[226322]: 2026-01-26 10:05:26.434 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:05:26 compute-1 nova_compute[226322]: 2026-01-26 10:05:26.435 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:05:26 compute-1 nova_compute[226322]: 2026-01-26 10:05:26.451 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:05:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:26 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:26 compute-1 ceph-mon[80107]: pgmap v688: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:05:26 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1779943853' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:05:26 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/4109204512' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:05:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100526 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 10:05:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:26.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:26.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:26 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:05:26 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1283743844' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:05:26 compute-1 nova_compute[226322]: 2026-01-26 10:05:26.905 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:05:26 compute-1 nova_compute[226322]: 2026-01-26 10:05:26.909 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:05:26 compute-1 nova_compute[226322]: 2026-01-26 10:05:26.926 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:05:26 compute-1 nova_compute[226322]: 2026-01-26 10:05:26.928 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:05:26 compute-1 nova_compute[226322]: 2026-01-26 10:05:26.928 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:05:27 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1283743844' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:05:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:27 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:27 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004440 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:27 compute-1 nova_compute[226322]: 2026-01-26 10:05:27.928 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:05:27 compute-1 nova_compute[226322]: 2026-01-26 10:05:27.929 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:05:27 compute-1 nova_compute[226322]: 2026-01-26 10:05:27.929 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:05:27 compute-1 nova_compute[226322]: 2026-01-26 10:05:27.950 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:05:27 compute-1 nova_compute[226322]: 2026-01-26 10:05:27.951 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:05:28 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:05:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:28 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004440 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:28 compute-1 ceph-mon[80107]: pgmap v689: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:05:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:28.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:28.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:29 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:29 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:30 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:30 compute-1 ceph-mon[80107]: pgmap v690: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:05:30 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2084782635' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:05:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:30.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 26 10:05:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:30.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 26 10:05:31 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3566203685' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:05:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:31 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100531 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 10:05:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:31 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:32 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:32 compute-1 ceph-mon[80107]: pgmap v691: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 10:05:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 10:05:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:32.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 10:05:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:32.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:33 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:05:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:33 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:33 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004440 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:34 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:34 compute-1 ceph-mon[80107]: pgmap v692: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 26 10:05:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:05:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 26 10:05:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:34.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 26 10:05:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 26 10:05:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:34.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 26 10:05:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:35 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:05:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:35 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:35 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:35 compute-1 ceph-mon[80107]: pgmap v693: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 26 10:05:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:36 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004440 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:36.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:36.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000062s ======
Jan 26 10:05:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - - [26/Jan/2026:10:05:36.980 +0000] "GET /swift/info HTTP/1.1" 200 539 - "python-urllib3/1.26.5" - latency=0.002000062s
Jan 26 10:05:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:37 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:37 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:38 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:05:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:38 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:05:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:38 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:05:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:38 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:38 compute-1 ceph-mon[80107]: pgmap v694: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 26 10:05:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:38.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 26 10:05:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:38.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 26 10:05:39 compute-1 podman[228917]: 2026-01-26 10:05:39.307146172 +0000 UTC m=+0.082307901 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:05:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:39 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004460 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:39 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:40 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:40 compute-1 ceph-mon[80107]: pgmap v695: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 26 10:05:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:40.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000058s ======
Jan 26 10:05:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:40.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Jan 26 10:05:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:41 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 10:05:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:41 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:05:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:41 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:41 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004480 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:42 compute-1 sudo[228938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:05:42 compute-1 sudo[228938]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:05:42 compute-1 sudo[228938]: pam_unix(sudo:session): session closed for user root
Jan 26 10:05:42 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:42 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:42.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:05:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:42.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:05:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e144 e144: 3 total, 3 up, 3 in
Jan 26 10:05:43 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:05:43 compute-1 ceph-mon[80107]: pgmap v696: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 767 B/s wr, 2 op/s
Jan 26 10:05:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:43 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:43 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004480 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:44 compute-1 ceph-mon[80107]: osdmap e144: 3 total, 3 up, 3 in
Jan 26 10:05:44 compute-1 ceph-mon[80107]: pgmap v698: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 921 B/s wr, 2 op/s
Jan 26 10:05:44 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e145 e145: 3 total, 3 up, 3 in
Jan 26 10:05:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:44 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:05:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:44 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:05:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:44 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:44 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:05:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:44 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:05:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:44.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:05:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:44.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:05:45 compute-1 ceph-mon[80107]: osdmap e145: 3 total, 3 up, 3 in
Jan 26 10:05:45 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e146 e146: 3 total, 3 up, 3 in
Jan 26 10:05:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:45 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328003760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:45 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:46 compute-1 ceph-mon[80107]: pgmap v701: 353 pgs: 353 active+clean; 16 MiB data, 153 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 2.7 MiB/s wr, 28 op/s
Jan 26 10:05:46 compute-1 ceph-mon[80107]: osdmap e146: 3 total, 3 up, 3 in
Jan 26 10:05:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:46 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03500044a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:46 compute-1 sshd-session[228964]: Connection closed by authenticating user root 152.42.136.110 port 34898 [preauth]
Jan 26 10:05:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100546 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 10:05:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:05:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:46.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:05:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:46.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e147 e147: 3 total, 3 up, 3 in
Jan 26 10:05:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 10:05:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328003760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:48 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 e148: 3 total, 3 up, 3 in
Jan 26 10:05:48 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:05:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:48 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:48 compute-1 ceph-mon[80107]: pgmap v702: 353 pgs: 353 active+clean; 16 MiB data, 153 MiB used, 60 GiB / 60 GiB avail; 16 KiB/s rd, 2.7 MiB/s wr, 24 op/s
Jan 26 10:05:48 compute-1 ceph-mon[80107]: osdmap e147: 3 total, 3 up, 3 in
Jan 26 10:05:48 compute-1 ceph-mon[80107]: osdmap e148: 3 total, 3 up, 3 in
Jan 26 10:05:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:48.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:48.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:49 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03500044c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:49 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03440041a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:50 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03440041a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:50 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:05:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:50.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:50.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:51 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:51 compute-1 ceph-mon[80107]: pgmap v705: 353 pgs: 353 active+clean; 16 MiB data, 153 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 3.2 MiB/s wr, 29 op/s
Jan 26 10:05:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:51 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03500044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:52 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328003760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:52.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:05:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:52.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:05:52 compute-1 ceph-mon[80107]: pgmap v706: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 32 KiB/s rd, 4.1 MiB/s wr, 46 op/s
Jan 26 10:05:53 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:05:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:53 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03440041a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100553 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 10:05:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:53 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03440041a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:05:53.927 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:05:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:05:53.928 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:05:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:05:53.928 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:05:54 compute-1 podman[228971]: 2026-01-26 10:05:54.330616833 +0000 UTC m=+0.110988345 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 26 10:05:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:54 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004500 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:54.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:54.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:54 compute-1 ceph-mon[80107]: pgmap v707: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 3.1 MiB/s wr, 34 op/s
Jan 26 10:05:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:55 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328003760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:55 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:56 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03440041a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:56.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:05:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:56.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:05:56 compute-1 ceph-mon[80107]: pgmap v708: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 3.1 MiB/s wr, 35 op/s
Jan 26 10:05:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:57 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:57 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328003760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 10:05:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3630123591' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:05:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 10:05:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3630123591' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:05:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:05:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:58 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:05:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:58.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:05:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:05:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:05:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:58.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:05:58 compute-1 ceph-mon[80107]: pgmap v709: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 2.5 MiB/s wr, 28 op/s
Jan 26 10:05:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/3630123591' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:05:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/3630123591' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:05:59 compute-1 sshd-session[229000]: Connection closed by authenticating user root 178.62.249.31 port 41994 [preauth]
Jan 26 10:05:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:59 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03440041c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:05:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:59 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004540 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:00 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:00 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328003760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:00.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:00.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:00 compute-1 ceph-mon[80107]: pgmap v710: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 2.2 MiB/s wr, 24 op/s
Jan 26 10:06:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:01 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:01 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:01 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03440041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:02 compute-1 sudo[229006]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:06:02 compute-1 sudo[229006]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:06:02 compute-1 sudo[229006]: pam_unix(sudo:session): session closed for user root
Jan 26 10:06:02 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:02 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c002930 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:02.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:02.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:02 compute-1 ceph-mon[80107]: pgmap v711: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 16 KiB/s rd, 2.1 MiB/s wr, 23 op/s
Jan 26 10:06:03 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:06:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:03 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c002930 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:03 compute-1 ceph-mon[80107]: pgmap v712: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Jan 26 10:06:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:06:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:03 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:04 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03440041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:04 compute-1 sshd-session[229032]: Invalid user ubuntu from 104.248.198.71 port 59284
Jan 26 10:06:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 10:06:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:04.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 10:06:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 10:06:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:04.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 10:06:04 compute-1 sshd-session[229032]: Connection closed by invalid user ubuntu 104.248.198.71 port 59284 [preauth]
Jan 26 10:06:05 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:05.265 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:06:05 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:05.266 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 10:06:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:05 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03440041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:05 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:05 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328004470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:06 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:06.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:06 compute-1 ceph-mon[80107]: pgmap v713: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 26 10:06:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:06:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:06.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:06:07 compute-1 kernel: ganesha.nfsd[228145]: segfault at 50 ip 00007f03d675032e sp 00007f03367fb210 error 4 in libntirpc.so.5.8[7f03d6735000+2c000] likely on CPU 2 (core 0, socket 2)
Jan 26 10:06:07 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 10:06:07 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:07 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy ignored for local
Jan 26 10:06:07 compute-1 systemd[1]: Started Process Core Dump (PID 229036/UID 0).
Jan 26 10:06:08 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:08.268 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:06:08 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.348180) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421968348402, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 1058, "num_deletes": 255, "total_data_size": 2375899, "memory_usage": 2407856, "flush_reason": "Manual Compaction"}
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421968381347, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 1564177, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24013, "largest_seqno": 25066, "table_properties": {"data_size": 1559303, "index_size": 2398, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10405, "raw_average_key_size": 19, "raw_value_size": 1549381, "raw_average_value_size": 2853, "num_data_blocks": 105, "num_entries": 543, "num_filter_entries": 543, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769421898, "oldest_key_time": 1769421898, "file_creation_time": 1769421968, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 33196 microseconds, and 10369 cpu microseconds.
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.381389) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 1564177 bytes OK
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.381408) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.382807) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.382822) EVENT_LOG_v1 {"time_micros": 1769421968382819, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.382838) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2370650, prev total WAL file size 2370650, number of live WAL files 2.
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.383534) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323531' seq:72057594037927935, type:22 .. '6C6F676D00353032' seq:0, type:0; will stop at (end)
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(1527KB)], [45(11MB)]
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421968383584, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 13666022, "oldest_snapshot_seqno": -1}
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 5394 keys, 13454390 bytes, temperature: kUnknown
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421968456251, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 13454390, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13418066, "index_size": 21750, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13509, "raw_key_size": 137634, "raw_average_key_size": 25, "raw_value_size": 13320220, "raw_average_value_size": 2469, "num_data_blocks": 888, "num_entries": 5394, "num_filter_entries": 5394, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769421968, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.456607) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 13454390 bytes
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.463439) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 187.6 rd, 184.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 11.5 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(17.3) write-amplify(8.6) OK, records in: 5926, records dropped: 532 output_compression: NoCompression
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.463470) EVENT_LOG_v1 {"time_micros": 1769421968463456, "job": 26, "event": "compaction_finished", "compaction_time_micros": 72828, "compaction_time_cpu_micros": 24239, "output_level": 6, "num_output_files": 1, "total_output_size": 13454390, "num_input_records": 5926, "num_output_records": 5394, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421968464155, "job": 26, "event": "table_file_deletion", "file_number": 47}
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421968468953, "job": 26, "event": "table_file_deletion", "file_number": 45}
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.383459) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.469009) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.469016) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.469018) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.469020) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:06:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.469022) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:06:08 compute-1 ceph-mon[80107]: pgmap v714: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:06:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:06:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:08.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:06:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:08.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:08 compute-1 systemd-coredump[229037]: Process 227702 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 58:
                                                    #0  0x00007f03d675032e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    #1  0x0000000000000000 n/a (n/a + 0x0)
                                                    #2  0x00007f03d675a900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)
                                                    ELF object binary architecture: AMD x86-64
Jan 26 10:06:08 compute-1 systemd[1]: systemd-coredump@9-229036-0.service: Deactivated successfully.
Jan 26 10:06:08 compute-1 systemd[1]: systemd-coredump@9-229036-0.service: Consumed 1.214s CPU time.
Jan 26 10:06:08 compute-1 podman[229042]: 2026-01-26 10:06:08.990773162 +0000 UTC m=+0.028927394 container died da36348c21b1c64499d70f03079a8c10f8292749eea8acac8020d0941757a01f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Jan 26 10:06:09 compute-1 systemd[1]: var-lib-containers-storage-overlay-388b2e91ee2994701e041fb7b4f25f4a9af1107644f6559cc4d916cf9618d53c-merged.mount: Deactivated successfully.
Jan 26 10:06:09 compute-1 podman[229042]: 2026-01-26 10:06:09.145766336 +0000 UTC m=+0.183920548 container remove da36348c21b1c64499d70f03079a8c10f8292749eea8acac8020d0941757a01f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Jan 26 10:06:09 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 10:06:09 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 10:06:09 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.745s CPU time.
Jan 26 10:06:10 compute-1 podman[229087]: 2026-01-26 10:06:10.271429589 +0000 UTC m=+0.050698948 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 26 10:06:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:06:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:10.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:06:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:06:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:10.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:06:10 compute-1 ceph-mon[80107]: pgmap v715: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:06:12 compute-1 ceph-mon[80107]: pgmap v716: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 26 10:06:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:12.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:12.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:13 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:06:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100613 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 10:06:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:14.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:06:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:14.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:06:15 compute-1 ceph-mon[80107]: pgmap v717: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:06:16 compute-1 ceph-mon[80107]: pgmap v718: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 26 10:06:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:16.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:16.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:18 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:06:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:18.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:06:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:18.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:06:19 compute-1 ceph-mon[80107]: pgmap v719: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:06:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:06:19 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 10.
Jan 26 10:06:19 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 10:06:19 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.745s CPU time.
Jan 26 10:06:19 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 10:06:19 compute-1 podman[229160]: 2026-01-26 10:06:19.61563392 +0000 UTC m=+0.020793704 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 10:06:19 compute-1 podman[229160]: 2026-01-26 10:06:19.776106385 +0000 UTC m=+0.181266129 container create 926ab88f4fc917023edfabab998a063a7c947db60c118100bca1f0cd441e3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 26 10:06:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6018c0b2554aae38b16b4fd14d15b48c1e664305ff0d647a5d04a7a440271808/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 10:06:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6018c0b2554aae38b16b4fd14d15b48c1e664305ff0d647a5d04a7a440271808/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 10:06:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6018c0b2554aae38b16b4fd14d15b48c1e664305ff0d647a5d04a7a440271808/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 10:06:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6018c0b2554aae38b16b4fd14d15b48c1e664305ff0d647a5d04a7a440271808/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 10:06:20 compute-1 podman[229160]: 2026-01-26 10:06:20.22839519 +0000 UTC m=+0.633554974 container init 926ab88f4fc917023edfabab998a063a7c947db60c118100bca1f0cd441e3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:06:20 compute-1 podman[229160]: 2026-01-26 10:06:20.233725377 +0000 UTC m=+0.638885131 container start 926ab88f4fc917023edfabab998a063a7c947db60c118100bca1f0cd441e3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Jan 26 10:06:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:20 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 10:06:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:20 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 10:06:20 compute-1 bash[229160]: 926ab88f4fc917023edfabab998a063a7c947db60c118100bca1f0cd441e3dca
Jan 26 10:06:20 compute-1 ceph-mon[80107]: pgmap v720: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:06:20 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 10:06:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:20 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 10:06:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:20 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 10:06:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:20 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 10:06:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:20 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 10:06:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:20 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 10:06:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:20 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:06:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:20.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:20.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:21 compute-1 sudo[229218]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:06:21 compute-1 sudo[229218]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:06:21 compute-1 sudo[229218]: pam_unix(sudo:session): session closed for user root
Jan 26 10:06:21 compute-1 sudo[229243]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:06:21 compute-1 sudo[229243]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:06:21 compute-1 sudo[229243]: pam_unix(sudo:session): session closed for user root
Jan 26 10:06:21 compute-1 nova_compute[226322]: 2026-01-26 10:06:21.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:06:21 compute-1 nova_compute[226322]: 2026-01-26 10:06:21.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 26 10:06:21 compute-1 nova_compute[226322]: 2026-01-26 10:06:21.714 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 26 10:06:21 compute-1 nova_compute[226322]: 2026-01-26 10:06:21.715 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:06:21 compute-1 nova_compute[226322]: 2026-01-26 10:06:21.715 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 26 10:06:21 compute-1 nova_compute[226322]: 2026-01-26 10:06:21.737 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:06:22 compute-1 sudo[229301]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:06:22 compute-1 sudo[229301]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:06:22 compute-1 sudo[229301]: pam_unix(sudo:session): session closed for user root
Jan 26 10:06:22 compute-1 ceph-mon[80107]: pgmap v721: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 26 10:06:22 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:06:22 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:06:22 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:06:22 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:06:22 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:06:22 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:06:22 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:06:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:22.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:22.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:23 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:06:24 compute-1 ceph-mon[80107]: pgmap v722: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 26 10:06:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:06:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:24.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:06:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:24.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:25 compute-1 podman[229327]: 2026-01-26 10:06:25.308504951 +0000 UTC m=+0.085670328 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 26 10:06:25 compute-1 nova_compute[226322]: 2026-01-26 10:06:25.757 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:06:25 compute-1 nova_compute[226322]: 2026-01-26 10:06:25.757 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:06:25 compute-1 nova_compute[226322]: 2026-01-26 10:06:25.757 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:06:25 compute-1 nova_compute[226322]: 2026-01-26 10:06:25.758 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:06:25 compute-1 nova_compute[226322]: 2026-01-26 10:06:25.758 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:06:25 compute-1 nova_compute[226322]: 2026-01-26 10:06:25.758 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:06:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:26 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:06:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:26 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:06:26 compute-1 nova_compute[226322]: 2026-01-26 10:06:26.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:06:26 compute-1 nova_compute[226322]: 2026-01-26 10:06:26.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:06:26 compute-1 nova_compute[226322]: 2026-01-26 10:06:26.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:06:26 compute-1 nova_compute[226322]: 2026-01-26 10:06:26.706 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:06:26 compute-1 nova_compute[226322]: 2026-01-26 10:06:26.706 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:06:26 compute-1 nova_compute[226322]: 2026-01-26 10:06:26.707 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:06:26 compute-1 ceph-mon[80107]: pgmap v723: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 10:06:26 compute-1 nova_compute[226322]: 2026-01-26 10:06:26.739 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:06:26 compute-1 nova_compute[226322]: 2026-01-26 10:06:26.739 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:06:26 compute-1 nova_compute[226322]: 2026-01-26 10:06:26.739 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:06:26 compute-1 nova_compute[226322]: 2026-01-26 10:06:26.739 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:06:26 compute-1 nova_compute[226322]: 2026-01-26 10:06:26.740 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:06:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:06:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:26.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:06:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 10:06:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:26.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 10:06:27 compute-1 sudo[229377]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:06:27 compute-1 sudo[229377]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:06:27 compute-1 sudo[229377]: pam_unix(sudo:session): session closed for user root
Jan 26 10:06:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:06:27 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1480749292' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:06:27 compute-1 nova_compute[226322]: 2026-01-26 10:06:27.237 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:06:27 compute-1 nova_compute[226322]: 2026-01-26 10:06:27.394 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:06:27 compute-1 nova_compute[226322]: 2026-01-26 10:06:27.395 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5227MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:06:27 compute-1 nova_compute[226322]: 2026-01-26 10:06:27.395 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:06:27 compute-1 nova_compute[226322]: 2026-01-26 10:06:27.396 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:06:27 compute-1 nova_compute[226322]: 2026-01-26 10:06:27.595 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:06:27 compute-1 nova_compute[226322]: 2026-01-26 10:06:27.595 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:06:27 compute-1 nova_compute[226322]: 2026-01-26 10:06:27.650 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing inventories for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 26 10:06:27 compute-1 nova_compute[226322]: 2026-01-26 10:06:27.671 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating ProviderTree inventory for provider d06842a0-5d13-4573-bb78-d433bbb380e4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 26 10:06:27 compute-1 nova_compute[226322]: 2026-01-26 10:06:27.671 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating inventory in ProviderTree for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 10:06:27 compute-1 nova_compute[226322]: 2026-01-26 10:06:27.685 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing aggregate associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 26 10:06:27 compute-1 nova_compute[226322]: 2026-01-26 10:06:27.711 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing trait associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, traits: HW_CPU_X86_CLMUL,HW_CPU_X86_SSE,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_FMA3,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 26 10:06:27 compute-1 sshd-session[229355]: Connection closed by authenticating user root 152.42.136.110 port 60304 [preauth]
Jan 26 10:06:27 compute-1 nova_compute[226322]: 2026-01-26 10:06:27.769 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:06:28 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:06:28 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:06:28 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1480749292' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:06:28 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3328335903' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:06:28 compute-1 ceph-mon[80107]: pgmap v724: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 10:06:28 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:06:28 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4136502083' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:06:28 compute-1 nova_compute[226322]: 2026-01-26 10:06:28.258 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:06:28 compute-1 nova_compute[226322]: 2026-01-26 10:06:28.265 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:06:28 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:06:28 compute-1 nova_compute[226322]: 2026-01-26 10:06:28.334 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:06:28 compute-1 nova_compute[226322]: 2026-01-26 10:06:28.335 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:06:28 compute-1 nova_compute[226322]: 2026-01-26 10:06:28.336 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.940s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:06:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:06:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:28.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:06:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:28.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:29 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/4136502083' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:06:29 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/211135410' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:06:29 compute-1 nova_compute[226322]: 2026-01-26 10:06:29.316 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:06:30 compute-1 ceph-mon[80107]: pgmap v725: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 26 10:06:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:30.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:30.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:31 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3629389423' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:06:31 compute-1 nova_compute[226322]: 2026-01-26 10:06:31.322 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:06:31 compute-1 nova_compute[226322]: 2026-01-26 10:06:31.323 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:06:31 compute-1 nova_compute[226322]: 2026-01-26 10:06:31.340 226326 DEBUG nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 10:06:31 compute-1 nova_compute[226322]: 2026-01-26 10:06:31.433 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:06:31 compute-1 nova_compute[226322]: 2026-01-26 10:06:31.434 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:06:31 compute-1 nova_compute[226322]: 2026-01-26 10:06:31.439 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 10:06:31 compute-1 nova_compute[226322]: 2026-01-26 10:06:31.439 226326 INFO nova.compute.claims [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Claim successful on node compute-1.ctlplane.example.com
Jan 26 10:06:31 compute-1 nova_compute[226322]: 2026-01-26 10:06:31.554 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:06:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:06:32 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3240052794' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:06:32 compute-1 nova_compute[226322]: 2026-01-26 10:06:32.027 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:06:32 compute-1 nova_compute[226322]: 2026-01-26 10:06:32.034 226326 DEBUG nova.compute.provider_tree [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:06:32 compute-1 nova_compute[226322]: 2026-01-26 10:06:32.059 226326 DEBUG nova.scheduler.client.report [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:06:32 compute-1 nova_compute[226322]: 2026-01-26 10:06:32.173 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.739s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:06:32 compute-1 nova_compute[226322]: 2026-01-26 10:06:32.174 226326 DEBUG nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 10:06:32 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3327039981' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:06:32 compute-1 ceph-mon[80107]: pgmap v726: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 10:06:32 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3240052794' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:06:32 compute-1 nova_compute[226322]: 2026-01-26 10:06:32.339 226326 DEBUG nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 10:06:32 compute-1 nova_compute[226322]: 2026-01-26 10:06:32.340 226326 DEBUG nova.network.neutron [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 10:06:32 compute-1 nova_compute[226322]: 2026-01-26 10:06:32.369 226326 INFO nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 10:06:32 compute-1 nova_compute[226322]: 2026-01-26 10:06:32.392 226326 DEBUG nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 10:06:32 compute-1 nova_compute[226322]: 2026-01-26 10:06:32.472 226326 DEBUG nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 10:06:32 compute-1 nova_compute[226322]: 2026-01-26 10:06:32.473 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 10:06:32 compute-1 nova_compute[226322]: 2026-01-26 10:06:32.474 226326 INFO nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Creating image(s)
Jan 26 10:06:32 compute-1 nova_compute[226322]: 2026-01-26 10:06:32.502 226326 DEBUG nova.storage.rbd_utils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:06:32 compute-1 nova_compute[226322]: 2026-01-26 10:06:32.541 226326 DEBUG nova.storage.rbd_utils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:06:32 compute-1 nova_compute[226322]: 2026-01-26 10:06:32.572 226326 DEBUG nova.storage.rbd_utils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:06:32 compute-1 nova_compute[226322]: 2026-01-26 10:06:32.575 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "d81880e926e175d0cc7241caa7cc18231a8a289c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:06:32 compute-1 nova_compute[226322]: 2026-01-26 10:06:32.576 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "d81880e926e175d0cc7241caa7cc18231a8a289c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:06:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:32.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:06:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:32.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:06:32 compute-1 nova_compute[226322]: 2026-01-26 10:06:32.878 226326 WARNING oslo_policy.policy [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 26 10:06:32 compute-1 nova_compute[226322]: 2026-01-26 10:06:32.879 226326 WARNING oslo_policy.policy [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 26 10:06:32 compute-1 nova_compute[226322]: 2026-01-26 10:06:32.881 226326 DEBUG nova.policy [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c1208d3e25b940ea93fe76884c7a53db', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 10:06:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 10:06:33 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:06:33 compute-1 nova_compute[226322]: 2026-01-26 10:06:33.303 226326 DEBUG nova.virt.libvirt.imagebackend [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image locations are: [{'url': 'rbd://1a70b85d-e3fd-5814-8a6a-37ea00fcae30/images/6789692f-fc1f-4efa-ae75-dcc13be695ef/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://1a70b85d-e3fd-5814-8a6a-37ea00fcae30/images/6789692f-fc1f-4efa-ae75-dcc13be695ef/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Jan 26 10:06:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:33 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb314000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:33 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb30c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:34 compute-1 nova_compute[226322]: 2026-01-26 10:06:34.171 226326 DEBUG nova.network.neutron [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Successfully created port: 2a711799-8550-4432-83d1-93c9598eaa25 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 10:06:34 compute-1 nova_compute[226322]: 2026-01-26 10:06:34.279 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:06:34 compute-1 nova_compute[226322]: 2026-01-26 10:06:34.345 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c.part --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:06:34 compute-1 nova_compute[226322]: 2026-01-26 10:06:34.346 226326 DEBUG nova.virt.images [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] 6789692f-fc1f-4efa-ae75-dcc13be695ef was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 26 10:06:34 compute-1 nova_compute[226322]: 2026-01-26 10:06:34.347 226326 DEBUG nova.privsep.utils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 26 10:06:34 compute-1 nova_compute[226322]: 2026-01-26 10:06:34.348 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c.part /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:06:34 compute-1 nova_compute[226322]: 2026-01-26 10:06:34.541 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c.part /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c.converted" returned: 0 in 0.193s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:06:34 compute-1 nova_compute[226322]: 2026-01-26 10:06:34.545 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:06:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:34 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:34 compute-1 nova_compute[226322]: 2026-01-26 10:06:34.601 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c.converted --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:06:34 compute-1 nova_compute[226322]: 2026-01-26 10:06:34.602 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "d81880e926e175d0cc7241caa7cc18231a8a289c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.026s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:06:34 compute-1 ceph-mon[80107]: pgmap v727: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 938 B/s wr, 3 op/s
Jan 26 10:06:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:06:34 compute-1 nova_compute[226322]: 2026-01-26 10:06:34.629 226326 DEBUG nova.storage.rbd_utils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:06:34 compute-1 nova_compute[226322]: 2026-01-26 10:06:34.632 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:06:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:06:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:34.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:06:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:06:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:34.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:06:35 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e149 e149: 3 total, 3 up, 3 in
Jan 26 10:06:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100635 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 10:06:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:35 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:35 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb318001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:35 compute-1 nova_compute[226322]: 2026-01-26 10:06:35.994 226326 DEBUG nova.network.neutron [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Successfully updated port: 2a711799-8550-4432-83d1-93c9598eaa25 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 10:06:36 compute-1 nova_compute[226322]: 2026-01-26 10:06:36.011 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:06:36 compute-1 nova_compute[226322]: 2026-01-26 10:06:36.012 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquired lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:06:36 compute-1 nova_compute[226322]: 2026-01-26 10:06:36.012 226326 DEBUG nova.network.neutron [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 10:06:36 compute-1 nova_compute[226322]: 2026-01-26 10:06:36.293 226326 DEBUG nova.network.neutron [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 10:06:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:36 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb30c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:36.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:06:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:36.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:06:36 compute-1 nova_compute[226322]: 2026-01-26 10:06:36.909 226326 DEBUG nova.compute.manager [req-56d9524d-3235-4684-af56-0b64b5c454a4 req-ae65cc06-2009-4386-9777-60979fa86b37 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received event network-changed-2a711799-8550-4432-83d1-93c9598eaa25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:06:36 compute-1 nova_compute[226322]: 2026-01-26 10:06:36.910 226326 DEBUG nova.compute.manager [req-56d9524d-3235-4684-af56-0b64b5c454a4 req-ae65cc06-2009-4386-9777-60979fa86b37 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Refreshing instance network info cache due to event network-changed-2a711799-8550-4432-83d1-93c9598eaa25. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 10:06:36 compute-1 nova_compute[226322]: 2026-01-26 10:06:36.910 226326 DEBUG oslo_concurrency.lockutils [req-56d9524d-3235-4684-af56-0b64b5c454a4 req-ae65cc06-2009-4386-9777-60979fa86b37 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:06:36 compute-1 ceph-mon[80107]: pgmap v728: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1023 B/s wr, 11 op/s
Jan 26 10:06:36 compute-1 ceph-mon[80107]: osdmap e149: 3 total, 3 up, 3 in
Jan 26 10:06:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e150 e150: 3 total, 3 up, 3 in
Jan 26 10:06:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:37 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:37 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:38 compute-1 nova_compute[226322]: 2026-01-26 10:06:38.160 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.528s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:06:38 compute-1 nova_compute[226322]: 2026-01-26 10:06:38.249 226326 DEBUG nova.storage.rbd_utils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] resizing rbd image e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 26 10:06:38 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:06:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:38 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb3180023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:38 compute-1 ceph-mon[80107]: osdmap e150: 3 total, 3 up, 3 in
Jan 26 10:06:38 compute-1 ceph-mon[80107]: pgmap v731: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 639 B/s wr, 13 op/s
Jan 26 10:06:38 compute-1 nova_compute[226322]: 2026-01-26 10:06:38.725 226326 DEBUG nova.objects.instance [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'migration_context' on Instance uuid e89171d5-00d1-406a-bcc8-340bfdacbcbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 10:06:38 compute-1 nova_compute[226322]: 2026-01-26 10:06:38.743 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 10:06:38 compute-1 nova_compute[226322]: 2026-01-26 10:06:38.743 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Ensure instance console log exists: /var/lib/nova/instances/e89171d5-00d1-406a-bcc8-340bfdacbcbc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 10:06:38 compute-1 nova_compute[226322]: 2026-01-26 10:06:38.744 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:06:38 compute-1 nova_compute[226322]: 2026-01-26 10:06:38.744 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:06:38 compute-1 nova_compute[226322]: 2026-01-26 10:06:38.744 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:06:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:06:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:38.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:06:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 10:06:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:38.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 10:06:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:39 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb30c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.768 226326 DEBUG nova.network.neutron [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Updating instance_info_cache with network_info: [{"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.787 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Releasing lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.788 226326 DEBUG nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Instance network_info: |[{"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.788 226326 DEBUG oslo_concurrency.lockutils [req-56d9524d-3235-4684-af56-0b64b5c454a4 req-ae65cc06-2009-4386-9777-60979fa86b37 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.788 226326 DEBUG nova.network.neutron [req-56d9524d-3235-4684-af56-0b64b5c454a4 req-ae65cc06-2009-4386-9777-60979fa86b37 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Refreshing network info cache for port 2a711799-8550-4432-83d1-93c9598eaa25 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.791 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Start _get_guest_xml network_info=[{"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T10:05:39Z,direct_url=<?>,disk_format='qcow2',id=6789692f-fc1f-4efa-ae75-dcc13be695ef,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3ff3fa2a5531460b993c609589aa545d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T10:05:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'size': 0, 'image_id': '6789692f-fc1f-4efa-ae75-dcc13be695ef'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.796 226326 WARNING nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.801 226326 DEBUG nova.virt.libvirt.host [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.802 226326 DEBUG nova.virt.libvirt.host [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.809 226326 DEBUG nova.virt.libvirt.host [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.809 226326 DEBUG nova.virt.libvirt.host [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.810 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.810 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T10:05:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='57e1601b-dbfa-4d3b-8b96-27302e4a7a06',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T10:05:39Z,direct_url=<?>,disk_format='qcow2',id=6789692f-fc1f-4efa-ae75-dcc13be695ef,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3ff3fa2a5531460b993c609589aa545d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T10:05:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.810 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.810 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.811 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.811 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.811 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.811 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.812 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.812 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.812 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.812 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.815 226326 DEBUG nova.privsep.utils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 26 10:06:39 compute-1 nova_compute[226322]: 2026-01-26 10:06:39.816 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:06:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:39 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:40 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 26 10:06:40 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4246575752' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.307 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.327 226326 DEBUG nova.storage.rbd_utils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.331 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:06:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:40 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:40 compute-1 ceph-mon[80107]: pgmap v732: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 127 B/s wr, 10 op/s
Jan 26 10:06:40 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/4246575752' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:06:40 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 26 10:06:40 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3882837298' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.785 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.787 226326 DEBUG nova.virt.libvirt.vif [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T10:06:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1457730655',display_name='tempest-TestNetworkBasicOps-server-1457730655',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1457730655',id=1,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCzAq4PU9BtYDVE2893QINCbdmYXSOCaAokqiTddkRve8RypnqtELDxIf91MHtzezEtPVmgAZuG9SeAnge4WS1FhmBBt7X/P4oQbB5mob/LapxwgpslluUHdSteOPotImA==',key_name='tempest-TestNetworkBasicOps-1466537094',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-snk42cm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T10:06:32Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=e89171d5-00d1-406a-bcc8-340bfdacbcbc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.788 226326 DEBUG nova.network.os_vif_util [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.789 226326 DEBUG nova.network.os_vif_util [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:cd:f5,bridge_name='br-int',has_traffic_filtering=True,id=2a711799-8550-4432-83d1-93c9598eaa25,network=Network(7939c8f9-060e-41e0-9e41-ebecf5f62dfb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a711799-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.792 226326 DEBUG nova.objects.instance [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'pci_devices' on Instance uuid e89171d5-00d1-406a-bcc8-340bfdacbcbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.816 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] End _get_guest_xml xml=<domain type="kvm">
Jan 26 10:06:40 compute-1 nova_compute[226322]:   <uuid>e89171d5-00d1-406a-bcc8-340bfdacbcbc</uuid>
Jan 26 10:06:40 compute-1 nova_compute[226322]:   <name>instance-00000001</name>
Jan 26 10:06:40 compute-1 nova_compute[226322]:   <memory>131072</memory>
Jan 26 10:06:40 compute-1 nova_compute[226322]:   <vcpu>1</vcpu>
Jan 26 10:06:40 compute-1 nova_compute[226322]:   <metadata>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <nova:name>tempest-TestNetworkBasicOps-server-1457730655</nova:name>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <nova:creationTime>2026-01-26 10:06:39</nova:creationTime>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <nova:flavor name="m1.nano">
Jan 26 10:06:40 compute-1 nova_compute[226322]:         <nova:memory>128</nova:memory>
Jan 26 10:06:40 compute-1 nova_compute[226322]:         <nova:disk>1</nova:disk>
Jan 26 10:06:40 compute-1 nova_compute[226322]:         <nova:swap>0</nova:swap>
Jan 26 10:06:40 compute-1 nova_compute[226322]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 10:06:40 compute-1 nova_compute[226322]:         <nova:vcpus>1</nova:vcpus>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       </nova:flavor>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <nova:owner>
Jan 26 10:06:40 compute-1 nova_compute[226322]:         <nova:user uuid="c1208d3e25b940ea93fe76884c7a53db">tempest-TestNetworkBasicOps-966559857-project-member</nova:user>
Jan 26 10:06:40 compute-1 nova_compute[226322]:         <nova:project uuid="6ed221b375a44fc2bb2a8f232c5446e7">tempest-TestNetworkBasicOps-966559857</nova:project>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       </nova:owner>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <nova:root type="image" uuid="6789692f-fc1f-4efa-ae75-dcc13be695ef"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <nova:ports>
Jan 26 10:06:40 compute-1 nova_compute[226322]:         <nova:port uuid="2a711799-8550-4432-83d1-93c9598eaa25">
Jan 26 10:06:40 compute-1 nova_compute[226322]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:         </nova:port>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       </nova:ports>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     </nova:instance>
Jan 26 10:06:40 compute-1 nova_compute[226322]:   </metadata>
Jan 26 10:06:40 compute-1 nova_compute[226322]:   <sysinfo type="smbios">
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <system>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <entry name="manufacturer">RDO</entry>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <entry name="product">OpenStack Compute</entry>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <entry name="serial">e89171d5-00d1-406a-bcc8-340bfdacbcbc</entry>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <entry name="uuid">e89171d5-00d1-406a-bcc8-340bfdacbcbc</entry>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <entry name="family">Virtual Machine</entry>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     </system>
Jan 26 10:06:40 compute-1 nova_compute[226322]:   </sysinfo>
Jan 26 10:06:40 compute-1 nova_compute[226322]:   <os>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <boot dev="hd"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <smbios mode="sysinfo"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:   </os>
Jan 26 10:06:40 compute-1 nova_compute[226322]:   <features>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <acpi/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <apic/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <vmcoreinfo/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:   </features>
Jan 26 10:06:40 compute-1 nova_compute[226322]:   <clock offset="utc">
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <timer name="hpet" present="no"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:   </clock>
Jan 26 10:06:40 compute-1 nova_compute[226322]:   <cpu mode="host-model" match="exact">
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:   </cpu>
Jan 26 10:06:40 compute-1 nova_compute[226322]:   <devices>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <disk type="network" device="disk">
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <driver type="raw" cache="none"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <source protocol="rbd" name="vms/e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk">
Jan 26 10:06:40 compute-1 nova_compute[226322]:         <host name="192.168.122.100" port="6789"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:         <host name="192.168.122.102" port="6789"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:         <host name="192.168.122.101" port="6789"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       </source>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <auth username="openstack">
Jan 26 10:06:40 compute-1 nova_compute[226322]:         <secret type="ceph" uuid="1a70b85d-e3fd-5814-8a6a-37ea00fcae30"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       </auth>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <target dev="vda" bus="virtio"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     </disk>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <disk type="network" device="cdrom">
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <driver type="raw" cache="none"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <source protocol="rbd" name="vms/e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk.config">
Jan 26 10:06:40 compute-1 nova_compute[226322]:         <host name="192.168.122.100" port="6789"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:         <host name="192.168.122.102" port="6789"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:         <host name="192.168.122.101" port="6789"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       </source>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <auth username="openstack">
Jan 26 10:06:40 compute-1 nova_compute[226322]:         <secret type="ceph" uuid="1a70b85d-e3fd-5814-8a6a-37ea00fcae30"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       </auth>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <target dev="sda" bus="sata"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     </disk>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <interface type="ethernet">
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <mac address="fa:16:3e:30:cd:f5"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <model type="virtio"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <mtu size="1442"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <target dev="tap2a711799-85"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     </interface>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <serial type="pty">
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <log file="/var/lib/nova/instances/e89171d5-00d1-406a-bcc8-340bfdacbcbc/console.log" append="off"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     </serial>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <video>
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <model type="virtio"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     </video>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <input type="tablet" bus="usb"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <rng model="virtio">
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <backend model="random">/dev/urandom</backend>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     </rng>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <controller type="usb" index="0"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     <memballoon model="virtio">
Jan 26 10:06:40 compute-1 nova_compute[226322]:       <stats period="10"/>
Jan 26 10:06:40 compute-1 nova_compute[226322]:     </memballoon>
Jan 26 10:06:40 compute-1 nova_compute[226322]:   </devices>
Jan 26 10:06:40 compute-1 nova_compute[226322]: </domain>
Jan 26 10:06:40 compute-1 nova_compute[226322]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.818 226326 DEBUG nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Preparing to wait for external event network-vif-plugged-2a711799-8550-4432-83d1-93c9598eaa25 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.819 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.819 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.819 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.820 226326 DEBUG nova.virt.libvirt.vif [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T10:06:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1457730655',display_name='tempest-TestNetworkBasicOps-server-1457730655',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1457730655',id=1,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCzAq4PU9BtYDVE2893QINCbdmYXSOCaAokqiTddkRve8RypnqtELDxIf91MHtzezEtPVmgAZuG9SeAnge4WS1FhmBBt7X/P4oQbB5mob/LapxwgpslluUHdSteOPotImA==',key_name='tempest-TestNetworkBasicOps-1466537094',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-snk42cm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T10:06:32Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=e89171d5-00d1-406a-bcc8-340bfdacbcbc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.820 226326 DEBUG nova.network.os_vif_util [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.821 226326 DEBUG nova.network.os_vif_util [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:cd:f5,bridge_name='br-int',has_traffic_filtering=True,id=2a711799-8550-4432-83d1-93c9598eaa25,network=Network(7939c8f9-060e-41e0-9e41-ebecf5f62dfb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a711799-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.821 226326 DEBUG os_vif [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:cd:f5,bridge_name='br-int',has_traffic_filtering=True,id=2a711799-8550-4432-83d1-93c9598eaa25,network=Network(7939c8f9-060e-41e0-9e41-ebecf5f62dfb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a711799-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 10:06:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:06:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:40.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:06:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:06:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:40.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.866 226326 DEBUG ovsdbapp.backend.ovs_idl [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.867 226326 DEBUG ovsdbapp.backend.ovs_idl [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.867 226326 DEBUG ovsdbapp.backend.ovs_idl [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.868 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.868 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [POLLOUT] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.869 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.870 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.886 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.886 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.886 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 10:06:40 compute-1 nova_compute[226322]: 2026-01-26 10:06:40.887 226326 INFO oslo.privsep.daemon [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpjhe3kfhi/privsep.sock']
Jan 26 10:06:41 compute-1 podman[229714]: 2026-01-26 10:06:41.310459129 +0000 UTC m=+0.083383460 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 26 10:06:41 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3882837298' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:06:41 compute-1 nova_compute[226322]: 2026-01-26 10:06:41.577 226326 INFO oslo.privsep.daemon [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Spawned new privsep daemon via rootwrap
Jan 26 10:06:41 compute-1 nova_compute[226322]: 2026-01-26 10:06:41.455 229734 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 10:06:41 compute-1 nova_compute[226322]: 2026-01-26 10:06:41.459 229734 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 10:06:41 compute-1 nova_compute[226322]: 2026-01-26 10:06:41.461 229734 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Jan 26 10:06:41 compute-1 nova_compute[226322]: 2026-01-26 10:06:41.461 229734 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229734
Jan 26 10:06:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:41 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb3180023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:41 compute-1 nova_compute[226322]: 2026-01-26 10:06:41.908 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:41 compute-1 nova_compute[226322]: 2026-01-26 10:06:41.909 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2a711799-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:06:41 compute-1 nova_compute[226322]: 2026-01-26 10:06:41.909 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2a711799-85, col_values=(('external_ids', {'iface-id': '2a711799-8550-4432-83d1-93c9598eaa25', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:30:cd:f5', 'vm-uuid': 'e89171d5-00d1-406a-bcc8-340bfdacbcbc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:06:41 compute-1 NetworkManager[49073]: <info>  [1769422001.9130] manager: (tap2a711799-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Jan 26 10:06:41 compute-1 nova_compute[226322]: 2026-01-26 10:06:41.913 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:06:41 compute-1 nova_compute[226322]: 2026-01-26 10:06:41.921 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:41 compute-1 nova_compute[226322]: 2026-01-26 10:06:41.922 226326 INFO os_vif [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:cd:f5,bridge_name='br-int',has_traffic_filtering=True,id=2a711799-8550-4432-83d1-93c9598eaa25,network=Network(7939c8f9-060e-41e0-9e41-ebecf5f62dfb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a711799-85')
Jan 26 10:06:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:41 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb30c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:41 compute-1 nova_compute[226322]: 2026-01-26 10:06:41.981 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 10:06:41 compute-1 nova_compute[226322]: 2026-01-26 10:06:41.982 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 10:06:41 compute-1 nova_compute[226322]: 2026-01-26 10:06:41.982 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No VIF found with MAC fa:16:3e:30:cd:f5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 10:06:41 compute-1 nova_compute[226322]: 2026-01-26 10:06:41.982 226326 INFO nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Using config drive
Jan 26 10:06:42 compute-1 nova_compute[226322]: 2026-01-26 10:06:42.007 226326 DEBUG nova.storage.rbd_utils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:06:42 compute-1 sudo[229761]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:06:42 compute-1 sudo[229761]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:06:42 compute-1 sudo[229761]: pam_unix(sudo:session): session closed for user root
Jan 26 10:06:42 compute-1 nova_compute[226322]: 2026-01-26 10:06:42.317 226326 DEBUG nova.network.neutron [req-56d9524d-3235-4684-af56-0b64b5c454a4 req-ae65cc06-2009-4386-9777-60979fa86b37 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Updated VIF entry in instance network info cache for port 2a711799-8550-4432-83d1-93c9598eaa25. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 10:06:42 compute-1 nova_compute[226322]: 2026-01-26 10:06:42.318 226326 DEBUG nova.network.neutron [req-56d9524d-3235-4684-af56-0b64b5c454a4 req-ae65cc06-2009-4386-9777-60979fa86b37 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Updating instance_info_cache with network_info: [{"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:06:42 compute-1 nova_compute[226322]: 2026-01-26 10:06:42.336 226326 DEBUG oslo_concurrency.lockutils [req-56d9524d-3235-4684-af56-0b64b5c454a4 req-ae65cc06-2009-4386-9777-60979fa86b37 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:06:42 compute-1 nova_compute[226322]: 2026-01-26 10:06:42.494 226326 INFO nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Creating config drive at /var/lib/nova/instances/e89171d5-00d1-406a-bcc8-340bfdacbcbc/disk.config
Jan 26 10:06:42 compute-1 nova_compute[226322]: 2026-01-26 10:06:42.498 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e89171d5-00d1-406a-bcc8-340bfdacbcbc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp16dhfavv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:06:42 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:42 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:42 compute-1 ceph-mon[80107]: pgmap v733: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 51 op/s
Jan 26 10:06:42 compute-1 nova_compute[226322]: 2026-01-26 10:06:42.628 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e89171d5-00d1-406a-bcc8-340bfdacbcbc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp16dhfavv" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:06:42 compute-1 nova_compute[226322]: 2026-01-26 10:06:42.659 226326 DEBUG nova.storage.rbd_utils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:06:42 compute-1 nova_compute[226322]: 2026-01-26 10:06:42.663 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e89171d5-00d1-406a-bcc8-340bfdacbcbc/disk.config e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:06:42 compute-1 nova_compute[226322]: 2026-01-26 10:06:42.792 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e89171d5-00d1-406a-bcc8-340bfdacbcbc/disk.config e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:06:42 compute-1 nova_compute[226322]: 2026-01-26 10:06:42.793 226326 INFO nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Deleting local config drive /var/lib/nova/instances/e89171d5-00d1-406a-bcc8-340bfdacbcbc/disk.config because it was imported into RBD.
Jan 26 10:06:42 compute-1 systemd[1]: Starting libvirt secret daemon...
Jan 26 10:06:42 compute-1 systemd[1]: Started libvirt secret daemon.
Jan 26 10:06:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:42.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:06:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:42.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:06:42 compute-1 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 26 10:06:42 compute-1 kernel: tap2a711799-85: entered promiscuous mode
Jan 26 10:06:42 compute-1 NetworkManager[49073]: <info>  [1769422002.8971] manager: (tap2a711799-85): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Jan 26 10:06:42 compute-1 ovn_controller[133670]: 2026-01-26T10:06:42Z|00027|binding|INFO|Claiming lport 2a711799-8550-4432-83d1-93c9598eaa25 for this chassis.
Jan 26 10:06:42 compute-1 ovn_controller[133670]: 2026-01-26T10:06:42Z|00028|binding|INFO|2a711799-8550-4432-83d1-93c9598eaa25: Claiming fa:16:3e:30:cd:f5 10.100.0.8
Jan 26 10:06:42 compute-1 nova_compute[226322]: 2026-01-26 10:06:42.899 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:42 compute-1 nova_compute[226322]: 2026-01-26 10:06:42.903 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:42 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:42.923 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:cd:f5 10.100.0.8'], port_security=['fa:16:3e:30:cd:f5 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e89171d5-00d1-406a-bcc8-340bfdacbcbc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7939c8f9-060e-41e0-9e41-ebecf5f62dfb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a9f16b9a-82ea-4f86-8e79-c292f5efcb52', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1aeef230-e621-407c-aaae-8f0a628ce92a, chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], logical_port=2a711799-8550-4432-83d1-93c9598eaa25) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:06:42 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:42.924 143326 INFO neutron.agent.ovn.metadata.agent [-] Port 2a711799-8550-4432-83d1-93c9598eaa25 in datapath 7939c8f9-060e-41e0-9e41-ebecf5f62dfb bound to our chassis
Jan 26 10:06:42 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:42.926 143326 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7939c8f9-060e-41e0-9e41-ebecf5f62dfb
Jan 26 10:06:42 compute-1 systemd-udevd[229860]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 10:06:42 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:42.928 143326 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpgkojd4t6/privsep.sock']
Jan 26 10:06:42 compute-1 systemd-machined[194876]: New machine qemu-1-instance-00000001.
Jan 26 10:06:42 compute-1 NetworkManager[49073]: <info>  [1769422002.9602] device (tap2a711799-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 10:06:42 compute-1 NetworkManager[49073]: <info>  [1769422002.9610] device (tap2a711799-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 10:06:42 compute-1 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Jan 26 10:06:42 compute-1 nova_compute[226322]: 2026-01-26 10:06:42.982 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:42 compute-1 ovn_controller[133670]: 2026-01-26T10:06:42Z|00029|binding|INFO|Setting lport 2a711799-8550-4432-83d1-93c9598eaa25 ovn-installed in OVS
Jan 26 10:06:42 compute-1 ovn_controller[133670]: 2026-01-26T10:06:42Z|00030|binding|INFO|Setting lport 2a711799-8550-4432-83d1-93c9598eaa25 up in Southbound
Jan 26 10:06:42 compute-1 nova_compute[226322]: 2026-01-26 10:06:42.990 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:43 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 e151: 3 total, 3 up, 3 in
Jan 26 10:06:43 compute-1 sshd-session[229741]: Connection closed by authenticating user root 178.62.249.31 port 51962 [preauth]
Jan 26 10:06:43 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.601 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422003.6012678, e89171d5-00d1-406a-bcc8-340bfdacbcbc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.603 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] VM Started (Lifecycle Event)
Jan 26 10:06:43 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:43.621 143326 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 26 10:06:43 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:43.621 143326 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpgkojd4t6/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 26 10:06:43 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:43.492 229912 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 10:06:43 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:43.496 229912 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 10:06:43 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:43.498 229912 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Jan 26 10:06:43 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:43.498 229912 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229912
Jan 26 10:06:43 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:43.624 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[de62acc8-a100-475c-9a1e-cad0993d7715]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.636 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.640 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422003.6014047, e89171d5-00d1-406a-bcc8-340bfdacbcbc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.640 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] VM Paused (Lifecycle Event)
Jan 26 10:06:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:43 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.655 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.658 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.669 226326 DEBUG nova.compute.manager [req-b9b40dd8-e449-4f14-aea2-1bbc655792ab req-6d5fd51a-6c7a-44c1-849b-9b3133aa086f b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received event network-vif-plugged-2a711799-8550-4432-83d1-93c9598eaa25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.670 226326 DEBUG oslo_concurrency.lockutils [req-b9b40dd8-e449-4f14-aea2-1bbc655792ab req-6d5fd51a-6c7a-44c1-849b-9b3133aa086f b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.670 226326 DEBUG oslo_concurrency.lockutils [req-b9b40dd8-e449-4f14-aea2-1bbc655792ab req-6d5fd51a-6c7a-44c1-849b-9b3133aa086f b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.670 226326 DEBUG oslo_concurrency.lockutils [req-b9b40dd8-e449-4f14-aea2-1bbc655792ab req-6d5fd51a-6c7a-44c1-849b-9b3133aa086f b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.670 226326 DEBUG nova.compute.manager [req-b9b40dd8-e449-4f14-aea2-1bbc655792ab req-6d5fd51a-6c7a-44c1-849b-9b3133aa086f b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Processing event network-vif-plugged-2a711799-8550-4432-83d1-93c9598eaa25 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.671 226326 DEBUG nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.676 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.683 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.684 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422003.68418, e89171d5-00d1-406a-bcc8-340bfdacbcbc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.684 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] VM Resumed (Lifecycle Event)
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.688 226326 INFO nova.virt.libvirt.driver [-] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Instance spawned successfully.
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.688 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.706 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.710 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.732 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.739 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.740 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.740 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.741 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.741 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.741 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.788 226326 INFO nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Took 11.32 seconds to spawn the instance on the hypervisor.
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.789 226326 DEBUG nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.859 226326 INFO nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Took 12.45 seconds to build instance.
Jan 26 10:06:43 compute-1 nova_compute[226322]: 2026-01-26 10:06:43.876 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.553s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:06:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:43 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb3180023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:44 compute-1 nova_compute[226322]: 2026-01-26 10:06:44.118 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:44 compute-1 ceph-mon[80107]: osdmap e151: 3 total, 3 up, 3 in
Jan 26 10:06:44 compute-1 ceph-mon[80107]: pgmap v735: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 2.7 MiB/s wr, 41 op/s
Jan 26 10:06:44 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.179 229912 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:06:44 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.180 229912 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:06:44 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.180 229912 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:06:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:44 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb30c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:44 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.820 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[fdf68adb-7f57-4cfc-b08e-eb481667f189]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:06:44 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.821 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7939c8f9-01 in ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 10:06:44 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.822 229912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7939c8f9-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 10:06:44 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.823 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[45167cc3-7ace-4a58-9942-18db37853d3c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:06:44 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.825 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[fd11499a-2f89-498c-8eda-303f3c41bb94]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:06:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:44.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:44 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.855 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[8d897bd8-ed4e-4593-9d89-014b494662e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:06:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:44.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:44 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.875 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[ac74a3b3-a603-486a-b537-551e4786d458]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:06:44 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.877 143326 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpe_q9dabc/privsep.sock']
Jan 26 10:06:45 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:45.596 143326 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 26 10:06:45 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:45.597 143326 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpe_q9dabc/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 26 10:06:45 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:45.475 229936 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 10:06:45 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:45.482 229936 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 10:06:45 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:45.485 229936 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 26 10:06:45 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:45.486 229936 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229936
Jan 26 10:06:45 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:45.600 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[a8116edf-623e-4d9b-ab63-1d54fa580f66]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:06:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:45 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:45 compute-1 nova_compute[226322]: 2026-01-26 10:06:45.848 226326 DEBUG nova.compute.manager [req-9313bba1-32d9-41e4-970f-3d059f719f73 req-78b82fb4-0509-4f88-84e5-469c25415b92 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received event network-vif-plugged-2a711799-8550-4432-83d1-93c9598eaa25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:06:45 compute-1 nova_compute[226322]: 2026-01-26 10:06:45.848 226326 DEBUG oslo_concurrency.lockutils [req-9313bba1-32d9-41e4-970f-3d059f719f73 req-78b82fb4-0509-4f88-84e5-469c25415b92 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:06:45 compute-1 nova_compute[226322]: 2026-01-26 10:06:45.848 226326 DEBUG oslo_concurrency.lockutils [req-9313bba1-32d9-41e4-970f-3d059f719f73 req-78b82fb4-0509-4f88-84e5-469c25415b92 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:06:45 compute-1 nova_compute[226322]: 2026-01-26 10:06:45.849 226326 DEBUG oslo_concurrency.lockutils [req-9313bba1-32d9-41e4-970f-3d059f719f73 req-78b82fb4-0509-4f88-84e5-469c25415b92 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:06:45 compute-1 nova_compute[226322]: 2026-01-26 10:06:45.849 226326 DEBUG nova.compute.manager [req-9313bba1-32d9-41e4-970f-3d059f719f73 req-78b82fb4-0509-4f88-84e5-469c25415b92 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] No waiting events found dispatching network-vif-plugged-2a711799-8550-4432-83d1-93c9598eaa25 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 10:06:45 compute-1 nova_compute[226322]: 2026-01-26 10:06:45.849 226326 WARNING nova.compute.manager [req-9313bba1-32d9-41e4-970f-3d059f719f73 req-78b82fb4-0509-4f88-84e5-469c25415b92 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received unexpected event network-vif-plugged-2a711799-8550-4432-83d1-93c9598eaa25 for instance with vm_state active and task_state None.
Jan 26 10:06:45 compute-1 nova_compute[226322]: 2026-01-26 10:06:45.894 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:45 compute-1 NetworkManager[49073]: <info>  [1769422005.8951] manager: (patch-br-int-to-provnet-94d9950f-5cf2-4813-9455-dd14377245f4): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/25)
Jan 26 10:06:45 compute-1 NetworkManager[49073]: <info>  [1769422005.8960] device (patch-br-int-to-provnet-94d9950f-5cf2-4813-9455-dd14377245f4)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 10:06:45 compute-1 NetworkManager[49073]: <warn>  [1769422005.8961] device (patch-br-int-to-provnet-94d9950f-5cf2-4813-9455-dd14377245f4)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 10:06:45 compute-1 NetworkManager[49073]: <info>  [1769422005.8972] manager: (patch-provnet-94d9950f-5cf2-4813-9455-dd14377245f4-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/26)
Jan 26 10:06:45 compute-1 NetworkManager[49073]: <info>  [1769422005.8978] device (patch-provnet-94d9950f-5cf2-4813-9455-dd14377245f4-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 10:06:45 compute-1 NetworkManager[49073]: <warn>  [1769422005.8978] device (patch-provnet-94d9950f-5cf2-4813-9455-dd14377245f4-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 10:06:45 compute-1 NetworkManager[49073]: <info>  [1769422005.8989] manager: (patch-provnet-94d9950f-5cf2-4813-9455-dd14377245f4-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Jan 26 10:06:45 compute-1 NetworkManager[49073]: <info>  [1769422005.8998] manager: (patch-br-int-to-provnet-94d9950f-5cf2-4813-9455-dd14377245f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Jan 26 10:06:45 compute-1 NetworkManager[49073]: <info>  [1769422005.9003] device (patch-br-int-to-provnet-94d9950f-5cf2-4813-9455-dd14377245f4)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 26 10:06:45 compute-1 NetworkManager[49073]: <info>  [1769422005.9009] device (patch-provnet-94d9950f-5cf2-4813-9455-dd14377245f4-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 26 10:06:45 compute-1 nova_compute[226322]: 2026-01-26 10:06:45.929 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:45 compute-1 nova_compute[226322]: 2026-01-26 10:06:45.933 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:45 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:46 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.184 229936 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:06:46 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.184 229936 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:06:46 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.184 229936 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:06:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:46 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb3180034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:46 compute-1 ceph-mon[80107]: pgmap v736: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.5 MiB/s wr, 127 op/s
Jan 26 10:06:46 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.783 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[7e0dda9a-a39d-49ce-a46b-cfcb9ee67f54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:06:46 compute-1 NetworkManager[49073]: <info>  [1769422006.8067] manager: (tap7939c8f9-00): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Jan 26 10:06:46 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.807 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[5c977bfe-dd9e-43c3-8015-9486fb2a6263]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:06:46 compute-1 systemd-udevd[229950]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 10:06:46 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.836 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[d5c57640-3e17-4625-a24c-b497b1f4d87d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:06:46 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.839 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[0c4235c3-33c7-42e2-b36a-6fbb57bd5614]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:06:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:46.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:46.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:46 compute-1 NetworkManager[49073]: <info>  [1769422006.8630] device (tap7939c8f9-00): carrier: link connected
Jan 26 10:06:46 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.866 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[435e14ff-4a25-48ca-809e-ba4741724ce0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:06:46 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.885 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[87e88ad4-3c08-4e9b-ba15-76e6e525ff26]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7939c8f9-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:20:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396742, 'reachable_time': 15416, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229968, 'error': None, 'target': 'ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:06:46 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.897 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[17130bb0-c757-47dc-ac66-387d30980e07]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1b:202b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396742, 'tstamp': 396742}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229969, 'error': None, 'target': 'ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:06:46 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.910 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[bd0fd2b8-085f-43af-9b18-716c0a59a7ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7939c8f9-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:20:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396742, 'reachable_time': 15416, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229970, 'error': None, 'target': 'ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:06:46 compute-1 nova_compute[226322]: 2026-01-26 10:06:46.912 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:46 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.938 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[57d4e07b-64ed-4309-a48f-511086149fba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:06:46 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.990 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[cc5b754e-3f6e-41d8-9345-f922754f7ea7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:06:46 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.991 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7939c8f9-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:06:46 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.992 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 10:06:46 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.992 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7939c8f9-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:06:46 compute-1 kernel: tap7939c8f9-00: entered promiscuous mode
Jan 26 10:06:46 compute-1 nova_compute[226322]: 2026-01-26 10:06:46.994 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:46 compute-1 nova_compute[226322]: 2026-01-26 10:06:46.996 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:46 compute-1 NetworkManager[49073]: <info>  [1769422006.9968] manager: (tap7939c8f9-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Jan 26 10:06:46 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.997 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7939c8f9-00, col_values=(('external_ids', {'iface-id': '87777a78-2fc8-466a-8f48-d1f2b973e0a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:06:46 compute-1 nova_compute[226322]: 2026-01-26 10:06:46.998 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:46 compute-1 ovn_controller[133670]: 2026-01-26T10:06:46Z|00031|binding|INFO|Releasing lport 87777a78-2fc8-466a-8f48-d1f2b973e0a9 from this chassis (sb_readonly=0)
Jan 26 10:06:47 compute-1 nova_compute[226322]: 2026-01-26 10:06:47.011 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:47.012 143326 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7939c8f9-060e-41e0-9e41-ebecf5f62dfb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7939c8f9-060e-41e0-9e41-ebecf5f62dfb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:47.013 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[89f19bcc-da54-4725-ad61-2f7b8a84206b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:47.016 143326 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]: global
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     log         /dev/log local0 debug
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     log-tag     haproxy-metadata-proxy-7939c8f9-060e-41e0-9e41-ebecf5f62dfb
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     user        root
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     group       root
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     maxconn     1024
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     pidfile     /var/lib/neutron/external/pids/7939c8f9-060e-41e0-9e41-ebecf5f62dfb.pid.haproxy
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     daemon
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]: 
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]: defaults
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     log global
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     mode http
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     option httplog
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     option dontlognull
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     option http-server-close
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     option forwardfor
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     retries                 3
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     timeout http-request    30s
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     timeout connect         30s
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     timeout client          32s
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     timeout server          32s
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     timeout http-keep-alive 30s
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]: 
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]: 
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]: listen listener
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     bind 169.254.169.254:80
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:     http-request add-header X-OVN-Network-ID 7939c8f9-060e-41e0-9e41-ebecf5f62dfb
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 10:06:47 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:47.017 143326 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb', 'env', 'PROCESS_TAG=haproxy-7939c8f9-060e-41e0-9e41-ebecf5f62dfb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7939c8f9-060e-41e0-9e41-ebecf5f62dfb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 10:06:47 compute-1 podman[230003]: 2026-01-26 10:06:47.400883647 +0000 UTC m=+0.048158535 container create 66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 26 10:06:47 compute-1 systemd[1]: Started libpod-conmon-66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6.scope.
Jan 26 10:06:47 compute-1 systemd[1]: Started libcrun container.
Jan 26 10:06:47 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6308204caa6ef9ef44a497d3a0e6f90a3cb0448655da17487cc7918118dc819d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 10:06:47 compute-1 podman[230003]: 2026-01-26 10:06:47.376278111 +0000 UTC m=+0.023553019 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 26 10:06:47 compute-1 podman[230003]: 2026-01-26 10:06:47.485231183 +0000 UTC m=+0.132506071 container init 66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 10:06:47 compute-1 podman[230003]: 2026-01-26 10:06:47.491938246 +0000 UTC m=+0.139213124 container start 66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 10:06:47 compute-1 neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb[230019]: [NOTICE]   (230023) : New worker (230026) forked
Jan 26 10:06:47 compute-1 neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb[230019]: [NOTICE]   (230023) : Loading success.
Jan 26 10:06:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:47 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb30c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:47 compute-1 nova_compute[226322]: 2026-01-26 10:06:47.937 226326 DEBUG nova.compute.manager [req-d0e10174-ddc4-455a-94fb-0cfe24442373 req-43421b70-be97-4c90-acab-8088287865e6 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received event network-changed-2a711799-8550-4432-83d1-93c9598eaa25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:06:47 compute-1 nova_compute[226322]: 2026-01-26 10:06:47.937 226326 DEBUG nova.compute.manager [req-d0e10174-ddc4-455a-94fb-0cfe24442373 req-43421b70-be97-4c90-acab-8088287865e6 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Refreshing instance network info cache due to event network-changed-2a711799-8550-4432-83d1-93c9598eaa25. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 10:06:47 compute-1 nova_compute[226322]: 2026-01-26 10:06:47.938 226326 DEBUG oslo_concurrency.lockutils [req-d0e10174-ddc4-455a-94fb-0cfe24442373 req-43421b70-be97-4c90-acab-8088287865e6 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:06:47 compute-1 nova_compute[226322]: 2026-01-26 10:06:47.938 226326 DEBUG oslo_concurrency.lockutils [req-d0e10174-ddc4-455a-94fb-0cfe24442373 req-43421b70-be97-4c90-acab-8088287865e6 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:06:47 compute-1 nova_compute[226322]: 2026-01-26 10:06:47.938 226326 DEBUG nova.network.neutron [req-d0e10174-ddc4-455a-94fb-0cfe24442373 req-43421b70-be97-4c90-acab-8088287865e6 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Refreshing network info cache for port 2a711799-8550-4432-83d1-93c9598eaa25 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 10:06:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:47 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:48 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:06:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:48 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100648 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 10:06:48 compute-1 ceph-mon[80107]: pgmap v737: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 2.1 MiB/s wr, 107 op/s
Jan 26 10:06:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:06:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999999s ======
Jan 26 10:06:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:48.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999999s
Jan 26 10:06:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:48.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:49 compute-1 nova_compute[226322]: 2026-01-26 10:06:49.119 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:49 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb3180034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:49 compute-1 nova_compute[226322]: 2026-01-26 10:06:49.824 226326 DEBUG nova.network.neutron [req-d0e10174-ddc4-455a-94fb-0cfe24442373 req-43421b70-be97-4c90-acab-8088287865e6 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Updated VIF entry in instance network info cache for port 2a711799-8550-4432-83d1-93c9598eaa25. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 10:06:49 compute-1 nova_compute[226322]: 2026-01-26 10:06:49.825 226326 DEBUG nova.network.neutron [req-d0e10174-ddc4-455a-94fb-0cfe24442373 req-43421b70-be97-4c90-acab-8088287865e6 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Updating instance_info_cache with network_info: [{"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:06:49 compute-1 nova_compute[226322]: 2026-01-26 10:06:49.841 226326 DEBUG oslo_concurrency.lockutils [req-d0e10174-ddc4-455a-94fb-0cfe24442373 req-43421b70-be97-4c90-acab-8088287865e6 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:06:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:49 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb30c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:50 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999999s ======
Jan 26 10:06:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:50.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999999s
Jan 26 10:06:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999999s ======
Jan 26 10:06:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:50.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999999s
Jan 26 10:06:50 compute-1 ceph-mon[80107]: pgmap v738: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 2.1 MiB/s wr, 107 op/s
Jan 26 10:06:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:51 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:51 compute-1 nova_compute[226322]: 2026-01-26 10:06:51.958 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:51 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb3180034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:52 compute-1 sshd-session[230036]: Invalid user ubuntu from 104.248.198.71 port 35400
Jan 26 10:06:52 compute-1 sshd-session[230036]: Connection closed by invalid user ubuntu 104.248.198.71 port 35400 [preauth]
Jan 26 10:06:52 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:52 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb30c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:52.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:52.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:53 compute-1 ceph-mon[80107]: pgmap v739: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 15 KiB/s wr, 88 op/s
Jan 26 10:06:53 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:06:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:53 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:53.928 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:06:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:53.929 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:06:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:06:53.930 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:06:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:53 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:54 compute-1 ceph-mon[80107]: pgmap v740: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 14 KiB/s wr, 85 op/s
Jan 26 10:06:54 compute-1 nova_compute[226322]: 2026-01-26 10:06:54.122 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:54 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb3180034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:54.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:54.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:55 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb30c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:55 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:55 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:56 compute-1 podman[230044]: 2026-01-26 10:06:56.306834439 +0000 UTC m=+0.089578523 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS 
Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 26 10:06:56 compute-1 ceph-mon[80107]: pgmap v741: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 78 op/s
Jan 26 10:06:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:56 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:06:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:56.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:56.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:56 compute-1 nova_compute[226322]: 2026-01-26 10:06:56.960 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:57 compute-1 ovn_controller[133670]: 2026-01-26T10:06:57Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:30:cd:f5 10.100.0.8
Jan 26 10:06:57 compute-1 ovn_controller[133670]: 2026-01-26T10:06:57Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:30:cd:f5 10.100.0.8
Jan 26 10:06:57 compute-1 kernel: ganesha.nfsd[229507]: segfault at 50 ip 00007fb39e93f32e sp 00007fb349ffa210 error 4 in libntirpc.so.5.8[7fb39e924000+2c000] likely on CPU 2 (core 0, socket 2)
Jan 26 10:06:57 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 10:06:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:57 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb3180034e0 fd 38 proxy ignored for local
Jan 26 10:06:57 compute-1 systemd[1]: Started Process Core Dump (PID 230072/UID 0).
Jan 26 10:06:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 10:06:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2156652848' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:06:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 10:06:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2156652848' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:06:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:06:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:58.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:06:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:06:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:58.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:06:59 compute-1 ceph-mon[80107]: pgmap v742: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 391 KiB/s rd, 16 op/s
Jan 26 10:06:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/2156652848' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:06:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/2156652848' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:06:59 compute-1 systemd-coredump[230073]: Process 229180 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 42:
                                                    #0  0x00007fb39e93f32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 26 10:06:59 compute-1 nova_compute[226322]: 2026-01-26 10:06:59.124 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:06:59 compute-1 systemd[1]: systemd-coredump@10-230072-0.service: Deactivated successfully.
Jan 26 10:06:59 compute-1 systemd[1]: systemd-coredump@10-230072-0.service: Consumed 1.149s CPU time.
Jan 26 10:06:59 compute-1 podman[230078]: 2026-01-26 10:06:59.281069933 +0000 UTC m=+0.031726659 container died 926ab88f4fc917023edfabab998a063a7c947db60c118100bca1f0cd441e3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 10:06:59 compute-1 systemd[1]: var-lib-containers-storage-overlay-6018c0b2554aae38b16b4fd14d15b48c1e664305ff0d647a5d04a7a440271808-merged.mount: Deactivated successfully.
Jan 26 10:06:59 compute-1 podman[230078]: 2026-01-26 10:06:59.322004703 +0000 UTC m=+0.072661449 container remove 926ab88f4fc917023edfabab998a063a7c947db60c118100bca1f0cd441e3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Jan 26 10:06:59 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 10:06:59 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 10:06:59 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.268s CPU time.
Jan 26 10:07:00 compute-1 ceph-mon[80107]: pgmap v743: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 391 KiB/s rd, 16 op/s
Jan 26 10:07:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:00.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:00.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:01 compute-1 nova_compute[226322]: 2026-01-26 10:07:01.962 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:02 compute-1 sudo[230124]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:07:02 compute-1 sudo[230124]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:07:02 compute-1 sudo[230124]: pam_unix(sudo:session): session closed for user root
Jan 26 10:07:02 compute-1 ceph-mon[80107]: pgmap v744: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 629 KiB/s rd, 2.1 MiB/s wr, 73 op/s
Jan 26 10:07:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999999s ======
Jan 26 10:07:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:02.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999999s
Jan 26 10:07:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:02.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:03 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:07:03 compute-1 nova_compute[226322]: 2026-01-26 10:07:03.503 226326 INFO nova.compute.manager [None req-c917550b-5647-41c7-968f-6a085006bfdc c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Get console output
Jan 26 10:07:03 compute-1 nova_compute[226322]: 2026-01-26 10:07:03.508 226326 INFO oslo.privsep.daemon [None req-c917550b-5647-41c7-968f-6a085006bfdc c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp2dcbnn_x/privsep.sock']
Jan 26 10:07:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100703 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 10:07:04 compute-1 nova_compute[226322]: 2026-01-26 10:07:04.128 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:04 compute-1 nova_compute[226322]: 2026-01-26 10:07:04.177 226326 INFO oslo.privsep.daemon [None req-c917550b-5647-41c7-968f-6a085006bfdc c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Spawned new privsep daemon via rootwrap
Jan 26 10:07:04 compute-1 nova_compute[226322]: 2026-01-26 10:07:04.045 230154 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 10:07:04 compute-1 nova_compute[226322]: 2026-01-26 10:07:04.051 230154 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 10:07:04 compute-1 nova_compute[226322]: 2026-01-26 10:07:04.055 230154 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 26 10:07:04 compute-1 nova_compute[226322]: 2026-01-26 10:07:04.055 230154 INFO oslo.privsep.daemon [-] privsep daemon running as pid 230154
Jan 26 10:07:04 compute-1 nova_compute[226322]: 2026-01-26 10:07:04.268 230154 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 10:07:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:04.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:04.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:04 compute-1 ceph-mon[80107]: pgmap v745: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 255 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Jan 26 10:07:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:07:06 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:07:06.629 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:07:06 compute-1 nova_compute[226322]: 2026-01-26 10:07:06.629 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:06 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:07:06.630 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 10:07:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999998s ======
Jan 26 10:07:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:06.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999998s
Jan 26 10:07:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:06.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:06 compute-1 ceph-mon[80107]: pgmap v746: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 255 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Jan 26 10:07:06 compute-1 nova_compute[226322]: 2026-01-26 10:07:06.965 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:07 compute-1 ceph-mon[80107]: pgmap v747: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 239 KiB/s rd, 2.1 MiB/s wr, 58 op/s
Jan 26 10:07:08 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:07:08 compute-1 sshd-session[230158]: Connection closed by authenticating user root 152.42.136.110 port 43220 [preauth]
Jan 26 10:07:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999999s ======
Jan 26 10:07:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:08.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999999s
Jan 26 10:07:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:08.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:09 compute-1 nova_compute[226322]: 2026-01-26 10:07:09.128 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:09 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 11.
Jan 26 10:07:09 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 10:07:09 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.268s CPU time.
Jan 26 10:07:09 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 10:07:09 compute-1 podman[230212]: 2026-01-26 10:07:09.980772449 +0000 UTC m=+0.060574542 container create 9c1ca928cfedd7dde65d34240d8625de067d4d53061cc47a3e852c486a704669 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True)
Jan 26 10:07:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e984c40ba02c1dc8b9e293440818da050517ac9c1b5775a04f860e5f572a1a4e/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 10:07:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e984c40ba02c1dc8b9e293440818da050517ac9c1b5775a04f860e5f572a1a4e/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 10:07:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e984c40ba02c1dc8b9e293440818da050517ac9c1b5775a04f860e5f572a1a4e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 10:07:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e984c40ba02c1dc8b9e293440818da050517ac9c1b5775a04f860e5f572a1a4e/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 10:07:10 compute-1 podman[230212]: 2026-01-26 10:07:10.037876652 +0000 UTC m=+0.117678795 container init 9c1ca928cfedd7dde65d34240d8625de067d4d53061cc47a3e852c486a704669 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Jan 26 10:07:10 compute-1 podman[230212]: 2026-01-26 10:07:10.042408038 +0000 UTC m=+0.122210141 container start 9c1ca928cfedd7dde65d34240d8625de067d4d53061cc47a3e852c486a704669 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 10:07:10 compute-1 bash[230212]: 9c1ca928cfedd7dde65d34240d8625de067d4d53061cc47a3e852c486a704669
Jan 26 10:07:10 compute-1 podman[230212]: 2026-01-26 10:07:09.956394232 +0000 UTC m=+0.036196355 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 10:07:10 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 10:07:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:10 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 10:07:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:10 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 10:07:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:10 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 10:07:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:10 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 10:07:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:10 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 10:07:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:10 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 10:07:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:10 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 10:07:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:10 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:07:10 compute-1 ceph-mon[80107]: pgmap v748: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 239 KiB/s rd, 2.1 MiB/s wr, 58 op/s
Jan 26 10:07:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:10.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:10.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:11 compute-1 nova_compute[226322]: 2026-01-26 10:07:11.969 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:12 compute-1 podman[230270]: 2026-01-26 10:07:12.281454231 +0000 UTC m=+0.051263580 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 10:07:12 compute-1 ceph-mon[80107]: pgmap v749: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 244 KiB/s rd, 2.1 MiB/s wr, 59 op/s
Jan 26 10:07:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999999s ======
Jan 26 10:07:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:12.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999999s
Jan 26 10:07:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999999s ======
Jan 26 10:07:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:12.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999999s
Jan 26 10:07:13 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:07:13 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:07:13.632 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:07:14 compute-1 nova_compute[226322]: 2026-01-26 10:07:14.129 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:14 compute-1 ceph-mon[80107]: pgmap v750: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 5.7 KiB/s rd, 16 KiB/s wr, 1 op/s
Jan 26 10:07:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001999997s ======
Jan 26 10:07:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:14.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001999997s
Jan 26 10:07:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999999s ======
Jan 26 10:07:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:14.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999999s
Jan 26 10:07:16 compute-1 ceph-mon[80107]: pgmap v751: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 7.0 KiB/s rd, 17 KiB/s wr, 3 op/s
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.080793) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422036080844, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 936, "num_deletes": 251, "total_data_size": 2039893, "memory_usage": 2058904, "flush_reason": "Manual Compaction"}
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422036147279, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 1342026, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25071, "largest_seqno": 26002, "table_properties": {"data_size": 1337658, "index_size": 2020, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 9870, "raw_average_key_size": 19, "raw_value_size": 1328751, "raw_average_value_size": 2678, "num_data_blocks": 89, "num_entries": 496, "num_filter_entries": 496, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769421968, "oldest_key_time": 1769421968, "file_creation_time": 1769422036, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 66523 microseconds, and 3981 cpu microseconds.
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.147321) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 1342026 bytes OK
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.147339) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.166150) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.166204) EVENT_LOG_v1 {"time_micros": 1769422036166196, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.166227) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 2035166, prev total WAL file size 2035447, number of live WAL files 2.
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.167132) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(1310KB)], [48(12MB)]
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422036167169, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 14796416, "oldest_snapshot_seqno": -1}
Jan 26 10:07:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 26 10:07:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 26 10:07:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:07:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:07:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5370 keys, 12632842 bytes, temperature: kUnknown
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422036359222, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 12632842, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12597349, "index_size": 20983, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13445, "raw_key_size": 137884, "raw_average_key_size": 25, "raw_value_size": 12500421, "raw_average_value_size": 2327, "num_data_blocks": 852, "num_entries": 5370, "num_filter_entries": 5370, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769422036, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.359426) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 12632842 bytes
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.361085) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 77.0 rd, 65.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 12.8 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(20.4) write-amplify(9.4) OK, records in: 5890, records dropped: 520 output_compression: NoCompression
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.361100) EVENT_LOG_v1 {"time_micros": 1769422036361093, "job": 28, "event": "compaction_finished", "compaction_time_micros": 192112, "compaction_time_cpu_micros": 23774, "output_level": 6, "num_output_files": 1, "total_output_size": 12632842, "num_input_records": 5890, "num_output_records": 5370, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422036361785, "job": 28, "event": "table_file_deletion", "file_number": 50}
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422036363816, "job": 28, "event": "table_file_deletion", "file_number": 48}
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.166996) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.363912) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.363916) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.363918) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.363919) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:07:16 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.363921) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:07:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:07:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:07:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:07:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:07:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:07:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:07:16 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:07:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:16.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:16.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:16 compute-1 nova_compute[226322]: 2026-01-26 10:07:16.971 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:17 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/480017378' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:07:18 compute-1 ceph-mon[80107]: pgmap v752: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 6.5 KiB/s rd, 4.2 KiB/s wr, 2 op/s
Jan 26 10:07:18 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:07:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100718 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 10:07:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:18.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:18.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:19 compute-1 nova_compute[226322]: 2026-01-26 10:07:19.131 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:07:20 compute-1 ceph-mon[80107]: pgmap v753: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 6.5 KiB/s rd, 4.2 KiB/s wr, 2 op/s
Jan 26 10:07:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:20.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:20.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:21 compute-1 nova_compute[226322]: 2026-01-26 10:07:21.973 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:22 compute-1 sudo[230295]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:07:22 compute-1 sudo[230295]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:07:22 compute-1 sudo[230295]: pam_unix(sudo:session): session closed for user root
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000026:nfs.cephfs.0: -2
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 10:07:22 compute-1 ceph-mon[80107]: pgmap v754: 353 pgs: 353 active+clean; 167 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 31 op/s
Jan 26 10:07:22 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1886990296' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:07:22 compute-1 nova_compute[226322]: 2026-01-26 10:07:22.683 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:07:22 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe38c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:07:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:22.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:07:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:22.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:23 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:07:23 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2925681421' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:07:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:23 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:23 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe374000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:24 compute-1 nova_compute[226322]: 2026-01-26 10:07:24.177 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:24 compute-1 ceph-mon[80107]: pgmap v755: 353 pgs: 353 active+clean; 167 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 30 op/s
Jan 26 10:07:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:24 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe368000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:07:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:24.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:07:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:07:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:24.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:07:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100725 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 10:07:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:25 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe370000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:25 compute-1 nova_compute[226322]: 2026-01-26 10:07:25.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:07:25 compute-1 nova_compute[226322]: 2026-01-26 10:07:25.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:07:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:25 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:26 compute-1 ceph-mon[80107]: pgmap v756: 353 pgs: 353 active+clean; 167 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 39 op/s
Jan 26 10:07:26 compute-1 nova_compute[226322]: 2026-01-26 10:07:26.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:07:26 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:26 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe3740016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:26.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:26.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:26 compute-1 nova_compute[226322]: 2026-01-26 10:07:26.975 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:27 compute-1 sudo[230344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:07:27 compute-1 sudo[230344]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:07:27 compute-1 sudo[230344]: pam_unix(sudo:session): session closed for user root
Jan 26 10:07:27 compute-1 podman[230338]: 2026-01-26 10:07:27.291254261 +0000 UTC m=+0.076701648 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 10:07:27 compute-1 sudo[230386]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:07:27 compute-1 sudo[230386]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:07:27 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3153713550' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:07:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:27 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe3680016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:27 compute-1 nova_compute[226322]: 2026-01-26 10:07:27.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:07:27 compute-1 nova_compute[226322]: 2026-01-26 10:07:27.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:07:27 compute-1 nova_compute[226322]: 2026-01-26 10:07:27.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:07:27 compute-1 sudo[230386]: pam_unix(sudo:session): session closed for user root
Jan 26 10:07:27 compute-1 nova_compute[226322]: 2026-01-26 10:07:27.900 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:07:27 compute-1 nova_compute[226322]: 2026-01-26 10:07:27.901 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquired lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:07:27 compute-1 nova_compute[226322]: 2026-01-26 10:07:27.901 226326 DEBUG nova.network.neutron [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 10:07:27 compute-1 nova_compute[226322]: 2026-01-26 10:07:27.901 226326 DEBUG nova.objects.instance [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e89171d5-00d1-406a-bcc8-340bfdacbcbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 10:07:27 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:27 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe370001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:28 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:07:28 compute-1 ceph-mon[80107]: pgmap v757: 353 pgs: 353 active+clean; 167 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 37 op/s
Jan 26 10:07:28 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2994227211' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:07:28 compute-1 sshd-session[230435]: Connection closed by authenticating user root 178.62.249.31 port 50154 [preauth]
Jan 26 10:07:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:28 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:28.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:28.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:29 compute-1 nova_compute[226322]: 2026-01-26 10:07:29.179 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:29 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe3740016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:29 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe3680016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:30 compute-1 ceph-mon[80107]: pgmap v758: 353 pgs: 353 active+clean; 167 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 37 op/s
Jan 26 10:07:30 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:30 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe3680016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:30.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:07:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:30.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.109 226326 DEBUG nova.network.neutron [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Updating instance_info_cache with network_info: [{"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.128 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Releasing lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.128 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.129 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.130 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.130 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.131 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.132 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.132 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.150 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.150 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.150 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.150 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.151 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:07:31 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:07:31 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3033423951' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.569 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:07:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:31 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.768 226326 DEBUG nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.768 226326 DEBUG nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.934 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.935 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4716MB free_disk=59.92194747924805GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.936 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.936 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:07:31 compute-1 nova_compute[226322]: 2026-01-26 10:07:31.979 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:31 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:31 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe3740016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:32 compute-1 nova_compute[226322]: 2026-01-26 10:07:32.016 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Instance e89171d5-00d1-406a-bcc8-340bfdacbcbc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 10:07:32 compute-1 nova_compute[226322]: 2026-01-26 10:07:32.016 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:07:32 compute-1 nova_compute[226322]: 2026-01-26 10:07:32.016 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:07:32 compute-1 nova_compute[226322]: 2026-01-26 10:07:32.056 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:07:32 compute-1 nova_compute[226322]: 2026-01-26 10:07:32.337 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:07:32 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/521770293' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:07:32 compute-1 nova_compute[226322]: 2026-01-26 10:07:32.477 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:07:32 compute-1 nova_compute[226322]: 2026-01-26 10:07:32.482 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating inventory in ProviderTree for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 10:07:32 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:07:32 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:07:32 compute-1 ceph-mon[80107]: pgmap v759: 353 pgs: 353 active+clean; 167 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 3.6 MiB/s rd, 1.8 MiB/s wr, 111 op/s
Jan 26 10:07:32 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3033423951' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:07:32 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:07:32 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:07:32 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3858921524' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:07:32 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:07:32 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:07:32 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:07:32 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:07:32 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:07:32 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/521770293' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:07:32 compute-1 nova_compute[226322]: 2026-01-26 10:07:32.733 226326 ERROR nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [req-0d619a14-1168-4598-a18e-0c9a74f80743] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID d06842a0-5d13-4573-bb78-d433bbb380e4.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-0d619a14-1168-4598-a18e-0c9a74f80743"}]}
Jan 26 10:07:32 compute-1 nova_compute[226322]: 2026-01-26 10:07:32.750 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing inventories for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 26 10:07:32 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:32 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe3680016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:32 compute-1 nova_compute[226322]: 2026-01-26 10:07:32.779 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating ProviderTree inventory for provider d06842a0-5d13-4573-bb78-d433bbb380e4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 26 10:07:32 compute-1 nova_compute[226322]: 2026-01-26 10:07:32.779 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating inventory in ProviderTree for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 10:07:32 compute-1 nova_compute[226322]: 2026-01-26 10:07:32.799 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing aggregate associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 26 10:07:32 compute-1 nova_compute[226322]: 2026-01-26 10:07:32.820 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing trait associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, traits: HW_CPU_X86_CLMUL,HW_CPU_X86_SSE,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_FMA3,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 26 10:07:32 compute-1 nova_compute[226322]: 2026-01-26 10:07:32.852 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:07:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:07:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:32.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:07:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:32.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:33 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:07:33 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1049851893' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:07:33 compute-1 nova_compute[226322]: 2026-01-26 10:07:33.296 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:07:33 compute-1 nova_compute[226322]: 2026-01-26 10:07:33.301 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating inventory in ProviderTree for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 10:07:33 compute-1 nova_compute[226322]: 2026-01-26 10:07:33.348 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updated inventory for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 26 10:07:33 compute-1 nova_compute[226322]: 2026-01-26 10:07:33.349 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating resource provider d06842a0-5d13-4573-bb78-d433bbb380e4 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 26 10:07:33 compute-1 nova_compute[226322]: 2026-01-26 10:07:33.349 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating inventory in ProviderTree for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 10:07:33 compute-1 nova_compute[226322]: 2026-01-26 10:07:33.378 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:07:33 compute-1 nova_compute[226322]: 2026-01-26 10:07:33.379 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.443s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:07:33 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:07:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:33 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe3700023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:33 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3350250843' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:07:33 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1049851893' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:07:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:07:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:33 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:34 compute-1 nova_compute[226322]: 2026-01-26 10:07:34.225 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:34 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe374002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100734 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 10:07:34 compute-1 ceph-mon[80107]: pgmap v760: 353 pgs: 353 active+clean; 167 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 15 KiB/s wr, 82 op/s
Jan 26 10:07:34 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2745198742' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:07:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:34.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:07:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:34.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:07:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:35 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe374002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:35 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:35 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe3700023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:36 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:36 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:36 compute-1 ceph-mon[80107]: pgmap v761: 353 pgs: 353 active+clean; 167 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 16 KiB/s wr, 83 op/s
Jan 26 10:07:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:36.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:36.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:36 compute-1 nova_compute[226322]: 2026-01-26 10:07:36.982 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:37 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:37 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:37 compute-1 ceph-mon[80107]: pgmap v762: 353 pgs: 353 active+clean; 167 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 16 KiB/s wr, 74 op/s
Jan 26 10:07:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:37 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe368002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:38 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:07:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:38 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe370002580 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:38 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 26 10:07:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:38.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:38.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:39 compute-1 nova_compute[226322]: 2026-01-26 10:07:39.230 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:39 compute-1 sshd-session[230521]: Invalid user ubuntu from 104.248.198.71 port 36336
Jan 26 10:07:39 compute-1 sshd-session[230521]: Connection closed by invalid user ubuntu 104.248.198.71 port 36336 [preauth]
Jan 26 10:07:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:39 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe374002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:40 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:40 compute-1 sudo[230524]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:07:40 compute-1 sudo[230524]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:07:40 compute-1 sudo[230524]: pam_unix(sudo:session): session closed for user root
Jan 26 10:07:40 compute-1 ceph-mon[80107]: pgmap v763: 353 pgs: 353 active+clean; 167 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 16 KiB/s wr, 74 op/s
Jan 26 10:07:40 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:07:40 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:07:40 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:40 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe368002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:40.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:40.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:41 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:41 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe370003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:41 compute-1 nova_compute[226322]: 2026-01-26 10:07:41.984 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:42 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:42 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe374003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:42 compute-1 sudo[230550]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:07:42 compute-1 sudo[230550]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:07:42 compute-1 sudo[230550]: pam_unix(sudo:session): session closed for user root
Jan 26 10:07:42 compute-1 podman[230574]: 2026-01-26 10:07:42.633553724 +0000 UTC m=+0.055930654 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 10:07:42 compute-1 ceph-mon[80107]: pgmap v764: 353 pgs: 353 active+clean; 179 MiB data, 340 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.4 MiB/s wr, 93 op/s
Jan 26 10:07:42 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:42 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:07:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:42.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:07:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:07:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:42.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:07:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:43 : epoch 69773cce : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:07:43 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:07:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:43 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe368003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:44 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe370003730 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:44 compute-1 nova_compute[226322]: 2026-01-26 10:07:44.232 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:44 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe374003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:44.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:07:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:44.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:07:45 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:45 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:46 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe368003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:46 : epoch 69773cce : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:07:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:46 : epoch 69773cce : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:07:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:46 : epoch 69773cce : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:07:46 compute-1 ceph-mon[80107]: pgmap v765: 353 pgs: 353 active+clean; 179 MiB data, 340 MiB used, 60 GiB / 60 GiB avail; 31 KiB/s rd, 1.4 MiB/s wr, 19 op/s
Jan 26 10:07:46 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:46 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe370003730 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:46.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:46.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:46 compute-1 nova_compute[226322]: 2026-01-26 10:07:46.986 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:47 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:47 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe374003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:47 compute-1 ceph-mon[80107]: pgmap v766: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 328 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 26 10:07:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:48 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:48 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:07:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:48 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:48 compute-1 ceph-mon[80107]: pgmap v767: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 26 10:07:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:07:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:07:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:48.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:07:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:48.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:49 : epoch 69773cce : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 10:07:49 compute-1 nova_compute[226322]: 2026-01-26 10:07:49.235 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:49 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe370003730 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:49 compute-1 sshd-session[230599]: Connection closed by authenticating user root 152.42.136.110 port 51012 [preauth]
Jan 26 10:07:50 compute-1 ceph-mon[80107]: pgmap v768: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 26 10:07:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:50 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe374003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:50 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe368003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 10:07:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:50.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:50.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:51 compute-1 kernel: ganesha.nfsd[230323]: segfault at 50 ip 00007fe4163f932e sp 00007fe3c0ff8210 error 4 in libntirpc.so.5.8[7fe4163de000+2c000] likely on CPU 4 (core 0, socket 4)
Jan 26 10:07:51 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 10:07:51 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:51 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384003cc0 fd 48 proxy ignored for local
Jan 26 10:07:51 compute-1 systemd[1]: Started Process Core Dump (PID 230603/UID 0).
Jan 26 10:07:51 compute-1 nova_compute[226322]: 2026-01-26 10:07:51.902 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:51 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:07:51.901 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:07:51 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:07:51.903 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 10:07:52 compute-1 nova_compute[226322]: 2026-01-26 10:07:52.005 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:52 compute-1 ceph-mon[80107]: pgmap v769: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 329 KiB/s rd, 2.1 MiB/s wr, 67 op/s
Jan 26 10:07:52 compute-1 systemd-coredump[230604]: Process 230231 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 43:
                                                    #0  0x00007fe4163f932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 26 10:07:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:07:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:52.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:07:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:07:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:52.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:07:52 compute-1 systemd[1]: systemd-coredump@11-230603-0.service: Deactivated successfully.
Jan 26 10:07:52 compute-1 systemd[1]: systemd-coredump@11-230603-0.service: Consumed 1.175s CPU time.
Jan 26 10:07:53 compute-1 podman[230610]: 2026-01-26 10:07:53.027847227 +0000 UTC m=+0.020040009 container died 9c1ca928cfedd7dde65d34240d8625de067d4d53061cc47a3e852c486a704669 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 10:07:53 compute-1 systemd[1]: var-lib-containers-storage-overlay-e984c40ba02c1dc8b9e293440818da050517ac9c1b5775a04f860e5f572a1a4e-merged.mount: Deactivated successfully.
Jan 26 10:07:53 compute-1 podman[230610]: 2026-01-26 10:07:53.060193223 +0000 UTC m=+0.052385975 container remove 9c1ca928cfedd7dde65d34240d8625de067d4d53061cc47a3e852c486a704669 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid)
Jan 26 10:07:53 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 10:07:53 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 10:07:53 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.551s CPU time.
Jan 26 10:07:53 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:07:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:07:53.930 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:07:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:07:53.931 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:07:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:07:53.931 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:07:54 compute-1 nova_compute[226322]: 2026-01-26 10:07:54.237 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:54 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:07:54.905 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:07:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 10:07:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:54.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 10:07:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:54.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:55 compute-1 ceph-mon[80107]: pgmap v770: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 298 KiB/s rd, 795 KiB/s wr, 48 op/s
Jan 26 10:07:56 compute-1 ceph-mon[80107]: pgmap v771: 353 pgs: 353 active+clean; 121 MiB data, 308 MiB used, 60 GiB / 60 GiB avail; 317 KiB/s rd, 797 KiB/s wr, 76 op/s
Jan 26 10:07:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:56.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:56.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:57 compute-1 nova_compute[226322]: 2026-01-26 10:07:57.008 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:57 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/4177848907' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:07:57 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100757 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 10:07:58 compute-1 podman[230657]: 2026-01-26 10:07:58.294319228 +0000 UTC m=+0.075047488 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 26 10:07:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:07:58 compute-1 ceph-mon[80107]: pgmap v772: 353 pgs: 353 active+clean; 121 MiB data, 308 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 17 KiB/s wr, 30 op/s
Jan 26 10:07:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/834116444' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:07:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/834116444' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:07:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:58.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:07:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:07:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:58.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:07:59 compute-1 nova_compute[226322]: 2026-01-26 10:07:59.278 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:07:59 compute-1 ovn_controller[133670]: 2026-01-26T10:07:59Z|00032|binding|INFO|Releasing lport 87777a78-2fc8-466a-8f48-d1f2b973e0a9 from this chassis (sb_readonly=0)
Jan 26 10:08:00 compute-1 nova_compute[226322]: 2026-01-26 10:08:00.031 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:00 compute-1 ceph-mon[80107]: pgmap v773: 353 pgs: 353 active+clean; 121 MiB data, 308 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 17 KiB/s wr, 30 op/s
Jan 26 10:08:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:00.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:08:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:00.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.347 226326 DEBUG nova.compute.manager [req-c639b0af-8c3a-471c-800e-2e573c44ac69 req-b14e9264-6dc7-4960-978c-5dff71c0282b b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received event network-changed-2a711799-8550-4432-83d1-93c9598eaa25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.347 226326 DEBUG nova.compute.manager [req-c639b0af-8c3a-471c-800e-2e573c44ac69 req-b14e9264-6dc7-4960-978c-5dff71c0282b b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Refreshing instance network info cache due to event network-changed-2a711799-8550-4432-83d1-93c9598eaa25. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.347 226326 DEBUG oslo_concurrency.lockutils [req-c639b0af-8c3a-471c-800e-2e573c44ac69 req-b14e9264-6dc7-4960-978c-5dff71c0282b b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.348 226326 DEBUG oslo_concurrency.lockutils [req-c639b0af-8c3a-471c-800e-2e573c44ac69 req-b14e9264-6dc7-4960-978c-5dff71c0282b b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.348 226326 DEBUG nova.network.neutron [req-c639b0af-8c3a-471c-800e-2e573c44ac69 req-b14e9264-6dc7-4960-978c-5dff71c0282b b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Refreshing network info cache for port 2a711799-8550-4432-83d1-93c9598eaa25 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.458 226326 DEBUG oslo_concurrency.lockutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.459 226326 DEBUG oslo_concurrency.lockutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.460 226326 DEBUG oslo_concurrency.lockutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.460 226326 DEBUG oslo_concurrency.lockutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.461 226326 DEBUG oslo_concurrency.lockutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.463 226326 INFO nova.compute.manager [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Terminating instance
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.465 226326 DEBUG nova.compute.manager [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 10:08:01 compute-1 kernel: tap2a711799-85 (unregistering): left promiscuous mode
Jan 26 10:08:01 compute-1 NetworkManager[49073]: <info>  [1769422081.5633] device (tap2a711799-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 10:08:01 compute-1 ovn_controller[133670]: 2026-01-26T10:08:01Z|00033|binding|INFO|Releasing lport 2a711799-8550-4432-83d1-93c9598eaa25 from this chassis (sb_readonly=0)
Jan 26 10:08:01 compute-1 ovn_controller[133670]: 2026-01-26T10:08:01Z|00034|binding|INFO|Setting lport 2a711799-8550-4432-83d1-93c9598eaa25 down in Southbound
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.574 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:01 compute-1 ovn_controller[133670]: 2026-01-26T10:08:01Z|00035|binding|INFO|Removing iface tap2a711799-85 ovn-installed in OVS
Jan 26 10:08:01 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.585 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:cd:f5 10.100.0.8'], port_security=['fa:16:3e:30:cd:f5 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e89171d5-00d1-406a-bcc8-340bfdacbcbc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7939c8f9-060e-41e0-9e41-ebecf5f62dfb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a9f16b9a-82ea-4f86-8e79-c292f5efcb52', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1aeef230-e621-407c-aaae-8f0a628ce92a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], logical_port=2a711799-8550-4432-83d1-93c9598eaa25) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:08:01 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.586 143326 INFO neutron.agent.ovn.metadata.agent [-] Port 2a711799-8550-4432-83d1-93c9598eaa25 in datapath 7939c8f9-060e-41e0-9e41-ebecf5f62dfb unbound from our chassis
Jan 26 10:08:01 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.588 143326 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7939c8f9-060e-41e0-9e41-ebecf5f62dfb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 10:08:01 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.589 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[afcec71b-a998-4226-bd3c-33d5e83d8758]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:08:01 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.589 143326 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb namespace which is not needed anymore
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.596 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:01 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Jan 26 10:08:01 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 16.458s CPU time.
Jan 26 10:08:01 compute-1 systemd-machined[194876]: Machine qemu-1-instance-00000001 terminated.
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.700 226326 INFO nova.virt.libvirt.driver [-] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Instance destroyed successfully.
Jan 26 10:08:01 compute-1 neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb[230019]: [NOTICE]   (230023) : haproxy version is 2.8.14-c23fe91
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.701 226326 DEBUG nova.objects.instance [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'resources' on Instance uuid e89171d5-00d1-406a-bcc8-340bfdacbcbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 10:08:01 compute-1 neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb[230019]: [NOTICE]   (230023) : path to executable is /usr/sbin/haproxy
Jan 26 10:08:01 compute-1 neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb[230019]: [WARNING]  (230023) : Exiting Master process...
Jan 26 10:08:01 compute-1 neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb[230019]: [ALERT]    (230023) : Current worker (230026) exited with code 143 (Terminated)
Jan 26 10:08:01 compute-1 neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb[230019]: [WARNING]  (230023) : All workers exited. Exiting... (0)
Jan 26 10:08:01 compute-1 systemd[1]: libpod-66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6.scope: Deactivated successfully.
Jan 26 10:08:01 compute-1 podman[230709]: 2026-01-26 10:08:01.71308444 +0000 UTC m=+0.040047167 container died 66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.729 226326 DEBUG nova.virt.libvirt.vif [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T10:06:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1457730655',display_name='tempest-TestNetworkBasicOps-server-1457730655',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1457730655',id=1,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCzAq4PU9BtYDVE2893QINCbdmYXSOCaAokqiTddkRve8RypnqtELDxIf91MHtzezEtPVmgAZuG9SeAnge4WS1FhmBBt7X/P4oQbB5mob/LapxwgpslluUHdSteOPotImA==',key_name='tempest-TestNetworkBasicOps-1466537094',keypairs=<?>,launch_index=0,launched_at=2026-01-26T10:06:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-snk42cm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T10:06:43Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=e89171d5-00d1-406a-bcc8-340bfdacbcbc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.730 226326 DEBUG nova.network.os_vif_util [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.731 226326 DEBUG nova.network.os_vif_util [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:30:cd:f5,bridge_name='br-int',has_traffic_filtering=True,id=2a711799-8550-4432-83d1-93c9598eaa25,network=Network(7939c8f9-060e-41e0-9e41-ebecf5f62dfb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a711799-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.731 226326 DEBUG os_vif [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:cd:f5,bridge_name='br-int',has_traffic_filtering=True,id=2a711799-8550-4432-83d1-93c9598eaa25,network=Network(7939c8f9-060e-41e0-9e41-ebecf5f62dfb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a711799-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.733 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.733 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2a711799-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:08:01 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6-userdata-shm.mount: Deactivated successfully.
Jan 26 10:08:01 compute-1 systemd[1]: var-lib-containers-storage-overlay-6308204caa6ef9ef44a497d3a0e6f90a3cb0448655da17487cc7918118dc819d-merged.mount: Deactivated successfully.
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.768 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.769 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.772 226326 INFO os_vif [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:cd:f5,bridge_name='br-int',has_traffic_filtering=True,id=2a711799-8550-4432-83d1-93c9598eaa25,network=Network(7939c8f9-060e-41e0-9e41-ebecf5f62dfb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a711799-85')
Jan 26 10:08:01 compute-1 podman[230709]: 2026-01-26 10:08:01.773988656 +0000 UTC m=+0.100951383 container cleanup 66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 26 10:08:01 compute-1 systemd[1]: libpod-conmon-66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6.scope: Deactivated successfully.
Jan 26 10:08:01 compute-1 podman[230755]: 2026-01-26 10:08:01.830881352 +0000 UTC m=+0.037955886 container remove 66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 26 10:08:01 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.836 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[924f91d8-05a9-4eb6-93bc-4e3242e50629]: (4, ('Mon Jan 26 10:08:01 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb (66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6)\n66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6\nMon Jan 26 10:08:01 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb (66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6)\n66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:08:01 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.837 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[ef493ef8-457c-4175-a88e-a4142ebb63ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:08:01 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.839 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7939c8f9-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:08:01 compute-1 kernel: tap7939c8f9-00: left promiscuous mode
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.841 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:01 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.844 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[815448fc-659f-4cd7-9378-a13e1e1e8dd6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:08:01 compute-1 nova_compute[226322]: 2026-01-26 10:08:01.854 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:01 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.862 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[c043c093-41fb-4aa9-9c14-92500e49ade5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:08:01 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.863 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[179b3520-6e16-4652-8c9d-941170dda6ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:08:01 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.874 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[0aa69c9d-ee54-4e78-997f-4b2d855a2a67]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396734, 'reachable_time': 32982, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230784, 'error': None, 'target': 'ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:08:01 compute-1 systemd[1]: run-netns-ovnmeta\x2d7939c8f9\x2d060e\x2d41e0\x2d9e41\x2debecf5f62dfb.mount: Deactivated successfully.
Jan 26 10:08:01 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.883 143615 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 10:08:01 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.884 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[7d080ee7-3578-4f18-b6f0-7ea0d369fe46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:08:02 compute-1 sudo[230786]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:08:02 compute-1 sudo[230786]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:08:02 compute-1 sudo[230786]: pam_unix(sudo:session): session closed for user root
Jan 26 10:08:02 compute-1 nova_compute[226322]: 2026-01-26 10:08:02.780 226326 DEBUG nova.network.neutron [req-c639b0af-8c3a-471c-800e-2e573c44ac69 req-b14e9264-6dc7-4960-978c-5dff71c0282b b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Updated VIF entry in instance network info cache for port 2a711799-8550-4432-83d1-93c9598eaa25. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 10:08:02 compute-1 nova_compute[226322]: 2026-01-26 10:08:02.781 226326 DEBUG nova.network.neutron [req-c639b0af-8c3a-471c-800e-2e573c44ac69 req-b14e9264-6dc7-4960-978c-5dff71c0282b b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Updating instance_info_cache with network_info: [{"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:08:02 compute-1 ceph-mon[80107]: pgmap v774: 353 pgs: 353 active+clean; 121 MiB data, 308 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 26 KiB/s wr, 31 op/s
Jan 26 10:08:02 compute-1 nova_compute[226322]: 2026-01-26 10:08:02.826 226326 DEBUG oslo_concurrency.lockutils [req-c639b0af-8c3a-471c-800e-2e573c44ac69 req-b14e9264-6dc7-4960-978c-5dff71c0282b b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:08:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:02.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:02.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.153 226326 INFO nova.virt.libvirt.driver [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Deleting instance files /var/lib/nova/instances/e89171d5-00d1-406a-bcc8-340bfdacbcbc_del
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.154 226326 INFO nova.virt.libvirt.driver [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Deletion of /var/lib/nova/instances/e89171d5-00d1-406a-bcc8-340bfdacbcbc_del complete
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.219 226326 DEBUG nova.virt.libvirt.host [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.220 226326 INFO nova.virt.libvirt.host [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] UEFI support detected
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.222 226326 INFO nova.compute.manager [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Took 1.76 seconds to destroy the instance on the hypervisor.
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.222 226326 DEBUG oslo.service.loopingcall [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.222 226326 DEBUG nova.compute.manager [-] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.222 226326 DEBUG nova.network.neutron [-] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.428 226326 DEBUG nova.compute.manager [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received event network-vif-unplugged-2a711799-8550-4432-83d1-93c9598eaa25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.428 226326 DEBUG oslo_concurrency.lockutils [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.428 226326 DEBUG oslo_concurrency.lockutils [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.429 226326 DEBUG oslo_concurrency.lockutils [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.429 226326 DEBUG nova.compute.manager [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] No waiting events found dispatching network-vif-unplugged-2a711799-8550-4432-83d1-93c9598eaa25 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.429 226326 DEBUG nova.compute.manager [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received event network-vif-unplugged-2a711799-8550-4432-83d1-93c9598eaa25 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.430 226326 DEBUG nova.compute.manager [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received event network-vif-plugged-2a711799-8550-4432-83d1-93c9598eaa25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.430 226326 DEBUG oslo_concurrency.lockutils [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.430 226326 DEBUG oslo_concurrency.lockutils [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.430 226326 DEBUG oslo_concurrency.lockutils [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.431 226326 DEBUG nova.compute.manager [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] No waiting events found dispatching network-vif-plugged-2a711799-8550-4432-83d1-93c9598eaa25 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.431 226326 WARNING nova.compute.manager [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received unexpected event network-vif-plugged-2a711799-8550-4432-83d1-93c9598eaa25 for instance with vm_state active and task_state deleting.
Jan 26 10:08:03 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 12.
Jan 26 10:08:03 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 10:08:03 compute-1 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.551s CPU time.
Jan 26 10:08:03 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 10:08:03 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:08:03 compute-1 podman[230864]: 2026-01-26 10:08:03.767589324 +0000 UTC m=+0.053892521 container create 2481bf60245d7c2e187aaa8c0a64c69d854b50d0db3801811d86e7df60b7c5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:08:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f886142fbe5c7c4ceb50c688ee912036a0bd37edf44bca34f52bdc51ce28cbf9/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 10:08:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f886142fbe5c7c4ceb50c688ee912036a0bd37edf44bca34f52bdc51ce28cbf9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 10:08:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f886142fbe5c7c4ceb50c688ee912036a0bd37edf44bca34f52bdc51ce28cbf9/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 10:08:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f886142fbe5c7c4ceb50c688ee912036a0bd37edf44bca34f52bdc51ce28cbf9/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 10:08:03 compute-1 podman[230864]: 2026-01-26 10:08:03.742060029 +0000 UTC m=+0.028363266 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 10:08:03 compute-1 podman[230864]: 2026-01-26 10:08:03.834750056 +0000 UTC m=+0.121053243 container init 2481bf60245d7c2e187aaa8c0a64c69d854b50d0db3801811d86e7df60b7c5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:08:03 compute-1 podman[230864]: 2026-01-26 10:08:03.848666972 +0000 UTC m=+0.134970149 container start 2481bf60245d7c2e187aaa8c0a64c69d854b50d0db3801811d86e7df60b7c5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Jan 26 10:08:03 compute-1 bash[230864]: 2481bf60245d7c2e187aaa8c0a64c69d854b50d0db3801811d86e7df60b7c5b3
Jan 26 10:08:03 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 10:08:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 10:08:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 10:08:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.970 226326 DEBUG nova.network.neutron [-] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:08:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 10:08:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 10:08:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 10:08:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 10:08:03 compute-1 nova_compute[226322]: 2026-01-26 10:08:03.990 226326 INFO nova.compute.manager [-] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Took 0.77 seconds to deallocate network for instance.
Jan 26 10:08:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 10:08:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:04 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:08:04 compute-1 nova_compute[226322]: 2026-01-26 10:08:04.064 226326 DEBUG oslo_concurrency.lockutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:08:04 compute-1 nova_compute[226322]: 2026-01-26 10:08:04.065 226326 DEBUG oslo_concurrency.lockutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:08:04 compute-1 nova_compute[226322]: 2026-01-26 10:08:04.121 226326 DEBUG oslo_concurrency.processutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:08:04 compute-1 nova_compute[226322]: 2026-01-26 10:08:04.150 226326 DEBUG nova.compute.manager [req-31d93124-1aea-483d-9968-14190943187e req-5361482f-372d-4dab-8d55-d7a25f53c306 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received event network-vif-deleted-2a711799-8550-4432-83d1-93c9598eaa25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:08:04 compute-1 nova_compute[226322]: 2026-01-26 10:08:04.320 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:04 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:08:04 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3400737954' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:08:04 compute-1 nova_compute[226322]: 2026-01-26 10:08:04.640 226326 DEBUG oslo_concurrency.processutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.518s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:08:04 compute-1 nova_compute[226322]: 2026-01-26 10:08:04.645 226326 DEBUG nova.compute.provider_tree [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:08:04 compute-1 nova_compute[226322]: 2026-01-26 10:08:04.683 226326 DEBUG nova.scheduler.client.report [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:08:04 compute-1 nova_compute[226322]: 2026-01-26 10:08:04.704 226326 DEBUG oslo_concurrency.lockutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:08:04 compute-1 nova_compute[226322]: 2026-01-26 10:08:04.729 226326 INFO nova.scheduler.client.report [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Deleted allocations for instance e89171d5-00d1-406a-bcc8-340bfdacbcbc
Jan 26 10:08:04 compute-1 nova_compute[226322]: 2026-01-26 10:08:04.916 226326 DEBUG oslo_concurrency.lockutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.457s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:08:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:04.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:04.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:04 compute-1 ceph-mon[80107]: pgmap v775: 353 pgs: 353 active+clean; 121 MiB data, 308 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 11 KiB/s wr, 29 op/s
Jan 26 10:08:04 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3400737954' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:08:06 compute-1 ceph-mon[80107]: pgmap v776: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 12 KiB/s wr, 57 op/s
Jan 26 10:08:06 compute-1 nova_compute[226322]: 2026-01-26 10:08:06.857 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:06.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:08:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:06.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:08:08 compute-1 nova_compute[226322]: 2026-01-26 10:08:08.289 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:08 compute-1 nova_compute[226322]: 2026-01-26 10:08:08.367 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:08 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:08:08 compute-1 ceph-mon[80107]: pgmap v777: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 10 KiB/s wr, 29 op/s
Jan 26 10:08:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:08.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:08:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:08.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:08:09 compute-1 nova_compute[226322]: 2026-01-26 10:08:09.322 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:10 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:08:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:10 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:08:10 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:10 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:08:10 compute-1 ceph-mon[80107]: pgmap v778: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 10 KiB/s wr, 29 op/s
Jan 26 10:08:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:10.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:10.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:11 compute-1 nova_compute[226322]: 2026-01-26 10:08:11.859 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:12 compute-1 ceph-mon[80107]: pgmap v779: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 11 KiB/s wr, 30 op/s
Jan 26 10:08:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:08:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:12.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:08:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:12.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:13 compute-1 podman[230950]: 2026-01-26 10:08:13.271870441 +0000 UTC m=+0.051585424 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 26 10:08:13 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:08:13 compute-1 ceph-mon[80107]: pgmap v780: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.6 KiB/s wr, 28 op/s
Jan 26 10:08:14 compute-1 sshd-session[230949]: Connection closed by authenticating user root 178.62.249.31 port 48258 [preauth]
Jan 26 10:08:14 compute-1 nova_compute[226322]: 2026-01-26 10:08:14.324 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:08:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:14.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:08:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:14.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:14 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:08:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:14 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:08:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:14 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:08:15 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:15 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:08:16 compute-1 ceph-mon[80107]: pgmap v781: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.6 KiB/s wr, 29 op/s
Jan 26 10:08:16 compute-1 nova_compute[226322]: 2026-01-26 10:08:16.700 226326 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769422081.6982865, e89171d5-00d1-406a-bcc8-340bfdacbcbc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 10:08:16 compute-1 nova_compute[226322]: 2026-01-26 10:08:16.700 226326 INFO nova.compute.manager [-] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] VM Stopped (Lifecycle Event)
Jan 26 10:08:16 compute-1 nova_compute[226322]: 2026-01-26 10:08:16.924 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:16 compute-1 nova_compute[226322]: 2026-01-26 10:08:16.960 226326 DEBUG nova.compute.manager [None req-afcc0dc7-3569-4c31-8d19-3d67fd7250ff - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:08:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:16.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:17.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:18 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:08:18 compute-1 ceph-mon[80107]: pgmap v782: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 341 B/s wr, 1 op/s
Jan 26 10:08:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:18.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:19.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:19 compute-1 nova_compute[226322]: 2026-01-26 10:08:19.380 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:08:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:08:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:08:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:08:20 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:20 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:08:20 compute-1 ceph-mon[80107]: pgmap v783: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 341 B/s wr, 1 op/s
Jan 26 10:08:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:20.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 10:08:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:21.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 10:08:21 compute-1 nova_compute[226322]: 2026-01-26 10:08:21.925 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:21 compute-1 ceph-mon[80107]: pgmap v784: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 341 B/s wr, 1 op/s
Jan 26 10:08:22 compute-1 sudo[230976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:08:22 compute-1 sudo[230976]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:08:22 compute-1 sudo[230976]: pam_unix(sudo:session): session closed for user root
Jan 26 10:08:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:08:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:22.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:08:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:08:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:23.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:08:23 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:08:24 compute-1 nova_compute[226322]: 2026-01-26 10:08:24.381 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:24 compute-1 ceph-mon[80107]: pgmap v785: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 26 10:08:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:08:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:24.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:08:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:08:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:08:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:08:25 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:08:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:25.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:26 compute-1 ceph-mon[80107]: pgmap v786: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:08:26 compute-1 nova_compute[226322]: 2026-01-26 10:08:26.986 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:26.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:27.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:27 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1542247195' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:08:27 compute-1 sshd-session[231004]: Invalid user centos from 104.248.198.71 port 57124
Jan 26 10:08:27 compute-1 sshd-session[231004]: Connection closed by invalid user centos 104.248.198.71 port 57124 [preauth]
Jan 26 10:08:28 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:08:28 compute-1 ceph-mon[80107]: pgmap v787: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 0 op/s
Jan 26 10:08:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:29.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:08:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:08:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:08:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:08:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:08:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:29.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:08:29 compute-1 podman[231006]: 2026-01-26 10:08:29.351592762 +0000 UTC m=+0.127440813 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 10:08:29 compute-1 nova_compute[226322]: 2026-01-26 10:08:29.383 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:29 compute-1 nova_compute[226322]: 2026-01-26 10:08:29.933 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:08:29 compute-1 nova_compute[226322]: 2026-01-26 10:08:29.934 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:08:29 compute-1 nova_compute[226322]: 2026-01-26 10:08:29.934 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:08:29 compute-1 nova_compute[226322]: 2026-01-26 10:08:29.934 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:08:30 compute-1 nova_compute[226322]: 2026-01-26 10:08:30.088 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:08:30 compute-1 nova_compute[226322]: 2026-01-26 10:08:30.089 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:08:30 compute-1 nova_compute[226322]: 2026-01-26 10:08:30.090 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:08:30 compute-1 nova_compute[226322]: 2026-01-26 10:08:30.090 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:08:30 compute-1 nova_compute[226322]: 2026-01-26 10:08:30.091 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:08:30 compute-1 nova_compute[226322]: 2026-01-26 10:08:30.091 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:08:30 compute-1 nova_compute[226322]: 2026-01-26 10:08:30.091 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:08:30 compute-1 nova_compute[226322]: 2026-01-26 10:08:30.112 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:08:30 compute-1 nova_compute[226322]: 2026-01-26 10:08:30.113 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:08:30 compute-1 nova_compute[226322]: 2026-01-26 10:08:30.113 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:08:30 compute-1 nova_compute[226322]: 2026-01-26 10:08:30.113 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:08:30 compute-1 nova_compute[226322]: 2026-01-26 10:08:30.113 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:08:30 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:08:30 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2323226978' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:08:30 compute-1 nova_compute[226322]: 2026-01-26 10:08:30.567 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:08:30 compute-1 nova_compute[226322]: 2026-01-26 10:08:30.717 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:08:30 compute-1 nova_compute[226322]: 2026-01-26 10:08:30.718 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4932MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:08:30 compute-1 nova_compute[226322]: 2026-01-26 10:08:30.718 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:08:30 compute-1 nova_compute[226322]: 2026-01-26 10:08:30.718 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:08:30 compute-1 nova_compute[226322]: 2026-01-26 10:08:30.797 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:08:30 compute-1 nova_compute[226322]: 2026-01-26 10:08:30.797 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:08:30 compute-1 nova_compute[226322]: 2026-01-26 10:08:30.813 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:08:30 compute-1 ceph-mon[80107]: pgmap v788: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 0 op/s
Jan 26 10:08:30 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2323226978' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:08:30 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3734299171' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:08:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:31.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:31.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:31 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:08:31 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2213818827' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:08:31 compute-1 nova_compute[226322]: 2026-01-26 10:08:31.288 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:08:31 compute-1 nova_compute[226322]: 2026-01-26 10:08:31.294 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:08:31 compute-1 nova_compute[226322]: 2026-01-26 10:08:31.308 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:08:31 compute-1 nova_compute[226322]: 2026-01-26 10:08:31.327 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:08:31 compute-1 nova_compute[226322]: 2026-01-26 10:08:31.327 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.609s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:08:31 compute-1 sshd-session[231052]: Connection closed by authenticating user root 152.42.136.110 port 35960 [preauth]
Jan 26 10:08:31 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2213818827' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:08:31 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1950693756' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:08:31 compute-1 nova_compute[226322]: 2026-01-26 10:08:31.925 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:08:31 compute-1 nova_compute[226322]: 2026-01-26 10:08:31.925 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:08:31 compute-1 nova_compute[226322]: 2026-01-26 10:08:31.989 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:32 compute-1 ceph-mon[80107]: pgmap v789: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:08:32 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2279715829' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:08:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 10:08:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:33.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 10:08:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:33.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:33 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:08:33 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2515291421' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:08:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:08:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:08:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:08:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:08:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:34 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:08:34 compute-1 nova_compute[226322]: 2026-01-26 10:08:34.386 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:34 compute-1 ceph-mon[80107]: pgmap v790: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 0 op/s
Jan 26 10:08:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:35.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:35.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:35 compute-1 ceph-mon[80107]: pgmap v791: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 35 op/s
Jan 26 10:08:36 compute-1 nova_compute[226322]: 2026-01-26 10:08:36.992 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:37 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1755361075' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:08:37 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2214757130' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:08:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:37.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:08:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:37.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:08:38 compute-1 ceph-mon[80107]: pgmap v792: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Jan 26 10:08:38 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:08:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:08:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:08:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:08:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:39 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:08:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:39.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:39.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:39 compute-1 nova_compute[226322]: 2026-01-26 10:08:39.388 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:40 compute-1 sudo[231084]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:08:40 compute-1 sudo[231084]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:08:40 compute-1 sudo[231084]: pam_unix(sudo:session): session closed for user root
Jan 26 10:08:40 compute-1 sudo[231109]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:08:40 compute-1 sudo[231109]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:08:40 compute-1 ceph-mon[80107]: pgmap v793: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Jan 26 10:08:40 compute-1 sudo[231109]: pam_unix(sudo:session): session closed for user root
Jan 26 10:08:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:41.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:41.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:41 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:08:41 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:08:41 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:08:41 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:08:41 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:08:41 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:08:41 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:08:41 compute-1 nova_compute[226322]: 2026-01-26 10:08:41.994 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:42 compute-1 sudo[231165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:08:42 compute-1 sudo[231165]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:08:42 compute-1 sudo[231165]: pam_unix(sudo:session): session closed for user root
Jan 26 10:08:42 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 26 10:08:42 compute-1 ceph-mon[80107]: pgmap v794: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 40 op/s
Jan 26 10:08:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:43.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:43.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:43 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:08:43 compute-1 ceph-mon[80107]: pgmap v795: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 39 op/s
Jan 26 10:08:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:08:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:08:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:08:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:44 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:08:44 compute-1 podman[231192]: 2026-01-26 10:08:44.279471278 +0000 UTC m=+0.055624986 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 26 10:08:44 compute-1 ovn_controller[133670]: 2026-01-26T10:08:44Z|00036|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 26 10:08:44 compute-1 nova_compute[226322]: 2026-01-26 10:08:44.390 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:45.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:45.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:45 compute-1 sudo[231212]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:08:45 compute-1 sudo[231212]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:08:45 compute-1 sudo[231212]: pam_unix(sudo:session): session closed for user root
Jan 26 10:08:46 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:08:46 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:08:46 compute-1 ceph-mon[80107]: pgmap v796: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 3.6 MiB/s rd, 1.8 MiB/s wr, 108 op/s
Jan 26 10:08:46 compute-1 nova_compute[226322]: 2026-01-26 10:08:46.996 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:47.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:47.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:48 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:08:48 compute-1 ceph-mon[80107]: pgmap v797: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 26 10:08:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:08:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:08:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:08:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:49 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:08:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 10:08:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:49.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 10:08:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:49.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:49 compute-1 nova_compute[226322]: 2026-01-26 10:08:49.431 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:08:50 compute-1 ceph-mon[80107]: pgmap v798: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 26 10:08:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:51.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:51.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:51 compute-1 nova_compute[226322]: 2026-01-26 10:08:51.998 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:52 compute-1 ceph-mon[80107]: pgmap v799: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 26 10:08:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:53.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 10:08:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:53.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 10:08:53 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:08:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:08:53.931 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:08:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:08:53.933 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:08:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:08:53.933 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:08:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:08:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:08:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:08:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:54 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:08:54 compute-1 ceph-mon[80107]: pgmap v800: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 69 op/s
Jan 26 10:08:54 compute-1 nova_compute[226322]: 2026-01-26 10:08:54.433 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:55.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:55.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:56 compute-1 ceph-mon[80107]: pgmap v801: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 130 op/s
Jan 26 10:08:57 compute-1 nova_compute[226322]: 2026-01-26 10:08:57.001 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:08:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 10:08:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:57.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 10:08:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:57.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:08:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:08:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:08:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:08:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:59 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:08:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:59.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:08:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:08:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:59.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:08:59 compute-1 ceph-mon[80107]: pgmap v802: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 269 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Jan 26 10:08:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/1772955815' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:08:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/1772955815' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:08:59 compute-1 nova_compute[226322]: 2026-01-26 10:08:59.471 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:00 compute-1 ceph-mon[80107]: pgmap v803: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 269 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Jan 26 10:09:00 compute-1 sshd-session[231246]: Connection closed by authenticating user root 178.62.249.31 port 51490 [preauth]
Jan 26 10:09:00 compute-1 podman[231249]: 2026-01-26 10:09:00.306931041 +0000 UTC m=+0.089330055 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:09:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 10:09:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:01.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 10:09:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:01.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:02 compute-1 nova_compute[226322]: 2026-01-26 10:09:02.002 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:02 compute-1 ceph-mon[80107]: pgmap v804: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 269 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Jan 26 10:09:02 compute-1 sudo[231278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:09:02 compute-1 sudo[231278]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:09:02 compute-1 sudo[231278]: pam_unix(sudo:session): session closed for user root
Jan 26 10:09:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:03.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:03.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:03 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:09:03 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:03.823 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:09:03 compute-1 nova_compute[226322]: 2026-01-26 10:09:03.824 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:03 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:03.824 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 10:09:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:09:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:09:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:09:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:04 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:09:04 compute-1 nova_compute[226322]: 2026-01-26 10:09:04.474 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:04 compute-1 ceph-mon[80107]: pgmap v805: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 269 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Jan 26 10:09:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:09:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:05.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 10:09:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:05.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 10:09:06 compute-1 ceph-mon[80107]: pgmap v806: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 269 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Jan 26 10:09:07 compute-1 nova_compute[226322]: 2026-01-26 10:09:07.004 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:07.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 10:09:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:07.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 10:09:08 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:09:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:09 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:09:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:09 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:09:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:09 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:09:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:09 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:09:09 compute-1 ceph-mon[80107]: pgmap v807: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 12 KiB/s wr, 1 op/s
Jan 26 10:09:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 10:09:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:09.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 10:09:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:09.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:09 compute-1 nova_compute[226322]: 2026-01-26 10:09:09.527 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:10 compute-1 ceph-mon[80107]: pgmap v808: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 12 KiB/s wr, 1 op/s
Jan 26 10:09:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:11.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:11.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:12 compute-1 nova_compute[226322]: 2026-01-26 10:09:12.055 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:12 compute-1 sshd-session[231308]: Invalid user admin from 152.42.136.110 port 35170
Jan 26 10:09:12 compute-1 ceph-mon[80107]: pgmap v809: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 14 KiB/s wr, 1 op/s
Jan 26 10:09:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:12.827 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:09:12 compute-1 sshd-session[231308]: Connection closed by invalid user admin 152.42.136.110 port 35170 [preauth]
Jan 26 10:09:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 10:09:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:13.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 10:09:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:13.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:13 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:09:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:09:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:09:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:09:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:14 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:09:14 compute-1 nova_compute[226322]: 2026-01-26 10:09:14.531 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:14 compute-1 ceph-mon[80107]: pgmap v810: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 2.0 KiB/s wr, 0 op/s
Jan 26 10:09:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 10:09:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:15.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 10:09:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:15.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:15 compute-1 podman[231311]: 2026-01-26 10:09:15.282494746 +0000 UTC m=+0.054357408 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 10:09:16 compute-1 ceph-mon[80107]: pgmap v811: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 3.0 KiB/s wr, 1 op/s
Jan 26 10:09:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 10:09:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:17.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 10:09:17 compute-1 nova_compute[226322]: 2026-01-26 10:09:17.093 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:17.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:17 compute-1 sshd-session[231332]: Invalid user centos from 104.248.198.71 port 57130
Jan 26 10:09:17 compute-1 sshd-session[231332]: Connection closed by invalid user centos 104.248.198.71 port 57130 [preauth]
Jan 26 10:09:18 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:09:18 compute-1 ceph-mon[80107]: pgmap v812: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 3.0 KiB/s wr, 0 op/s
Jan 26 10:09:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:09:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:09:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:09:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:09:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 10:09:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:19.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 10:09:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:19.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:19 compute-1 nova_compute[226322]: 2026-01-26 10:09:19.600 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:09:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 10:09:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:21.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 10:09:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:21.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:21 compute-1 ceph-mon[80107]: pgmap v813: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 3.0 KiB/s wr, 0 op/s
Jan 26 10:09:22 compute-1 nova_compute[226322]: 2026-01-26 10:09:22.146 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:22 compute-1 ceph-mon[80107]: pgmap v814: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 7.2 KiB/s rd, 7.0 KiB/s wr, 2 op/s
Jan 26 10:09:23 compute-1 sudo[231337]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:09:23 compute-1 sudo[231337]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:09:23 compute-1 sudo[231337]: pam_unix(sudo:session): session closed for user root
Jan 26 10:09:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:23.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:23.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:23 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:09:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:09:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:09:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:09:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:09:24 compute-1 nova_compute[226322]: 2026-01-26 10:09:24.601 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:24 compute-1 ceph-mon[80107]: pgmap v815: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 7.0 KiB/s rd, 5.0 KiB/s wr, 1 op/s
Jan 26 10:09:24 compute-1 nova_compute[226322]: 2026-01-26 10:09:24.682 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:09:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:25.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:25.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:25 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1303242856' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:09:26 compute-1 ceph-mon[80107]: pgmap v816: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 6.2 KiB/s wr, 29 op/s
Jan 26 10:09:26 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1429236293' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:09:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 10:09:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:27.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 10:09:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 10:09:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:27.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 10:09:27 compute-1 nova_compute[226322]: 2026-01-26 10:09:27.147 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:28 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:09:28 compute-1 nova_compute[226322]: 2026-01-26 10:09:28.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:09:28 compute-1 ceph-mon[80107]: pgmap v817: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 5.2 KiB/s wr, 29 op/s
Jan 26 10:09:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:09:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:09:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:09:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:09:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:29.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:29.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:29 compute-1 nova_compute[226322]: 2026-01-26 10:09:29.603 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:29 compute-1 nova_compute[226322]: 2026-01-26 10:09:29.680 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:09:29 compute-1 nova_compute[226322]: 2026-01-26 10:09:29.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:09:29 compute-1 nova_compute[226322]: 2026-01-26 10:09:29.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:09:29 compute-1 nova_compute[226322]: 2026-01-26 10:09:29.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:09:29 compute-1 nova_compute[226322]: 2026-01-26 10:09:29.707 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:09:29 compute-1 nova_compute[226322]: 2026-01-26 10:09:29.708 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:09:29 compute-1 nova_compute[226322]: 2026-01-26 10:09:29.708 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:09:29 compute-1 nova_compute[226322]: 2026-01-26 10:09:29.708 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:09:29 compute-1 nova_compute[226322]: 2026-01-26 10:09:29.709 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:09:29 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2035187687' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:09:30 compute-1 nova_compute[226322]: 2026-01-26 10:09:30.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:09:30 compute-1 nova_compute[226322]: 2026-01-26 10:09:30.707 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:09:30 compute-1 nova_compute[226322]: 2026-01-26 10:09:30.708 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:09:30 compute-1 nova_compute[226322]: 2026-01-26 10:09:30.708 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:09:30 compute-1 nova_compute[226322]: 2026-01-26 10:09:30.708 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:09:30 compute-1 nova_compute[226322]: 2026-01-26 10:09:30.709 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:09:30 compute-1 ceph-mon[80107]: pgmap v818: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 5.2 KiB/s wr, 29 op/s
Jan 26 10:09:30 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2407901453' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:09:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:31.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:31.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:31 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:09:31 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1839213161' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:09:31 compute-1 nova_compute[226322]: 2026-01-26 10:09:31.152 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:09:31 compute-1 podman[231388]: 2026-01-26 10:09:31.303449877 +0000 UTC m=+0.078947526 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 26 10:09:31 compute-1 nova_compute[226322]: 2026-01-26 10:09:31.325 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:09:31 compute-1 nova_compute[226322]: 2026-01-26 10:09:31.326 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4945MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:09:31 compute-1 nova_compute[226322]: 2026-01-26 10:09:31.326 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:09:31 compute-1 nova_compute[226322]: 2026-01-26 10:09:31.327 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:09:31 compute-1 nova_compute[226322]: 2026-01-26 10:09:31.483 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:09:31 compute-1 nova_compute[226322]: 2026-01-26 10:09:31.483 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:09:31 compute-1 nova_compute[226322]: 2026-01-26 10:09:31.496 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:09:31 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1839213161' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:09:31 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:09:31 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/267005787' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:09:31 compute-1 nova_compute[226322]: 2026-01-26 10:09:31.954 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:09:31 compute-1 nova_compute[226322]: 2026-01-26 10:09:31.959 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:09:31 compute-1 nova_compute[226322]: 2026-01-26 10:09:31.980 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:09:31 compute-1 nova_compute[226322]: 2026-01-26 10:09:31.981 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:09:31 compute-1 nova_compute[226322]: 2026-01-26 10:09:31.982 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:09:32 compute-1 nova_compute[226322]: 2026-01-26 10:09:32.149 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:32 compute-1 ceph-mon[80107]: pgmap v819: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 5.2 KiB/s wr, 29 op/s
Jan 26 10:09:32 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/267005787' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:09:32 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1818352728' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:09:32 compute-1 nova_compute[226322]: 2026-01-26 10:09:32.981 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:09:32 compute-1 nova_compute[226322]: 2026-01-26 10:09:32.982 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:09:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 10:09:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:33.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 10:09:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:33.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:33 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:09:33 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2260971169' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:09:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:09:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:09:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:09:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:09:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:34 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:09:34 compute-1 nova_compute[226322]: 2026-01-26 10:09:34.604 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:34 compute-1 ceph-mon[80107]: pgmap v820: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Jan 26 10:09:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 10:09:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:35.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 10:09:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:35.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:37.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:37 compute-1 ceph-mon[80107]: pgmap v821: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Jan 26 10:09:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:37.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:37 compute-1 nova_compute[226322]: 2026-01-26 10:09:37.151 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:38 compute-1 ceph-mon[80107]: pgmap v822: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 0 op/s
Jan 26 10:09:38 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:09:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:09:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:09:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:09:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:39 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:09:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:39.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:09:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:39.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:09:39 compute-1 nova_compute[226322]: 2026-01-26 10:09:39.606 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:40 compute-1 ceph-mon[80107]: pgmap v823: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 0 op/s
Jan 26 10:09:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:41.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:41.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:42 compute-1 nova_compute[226322]: 2026-01-26 10:09:42.154 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:42 compute-1 ceph-mon[80107]: pgmap v824: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 0 op/s
Jan 26 10:09:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:43.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:43 compute-1 sudo[231444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:09:43 compute-1 sudo[231444]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:09:43 compute-1 sudo[231444]: pam_unix(sudo:session): session closed for user root
Jan 26 10:09:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:43.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:43 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:09:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:09:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:09:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:09:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:44 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:09:44 compute-1 ceph-mon[80107]: pgmap v825: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 0 op/s
Jan 26 10:09:44 compute-1 nova_compute[226322]: 2026-01-26 10:09:44.651 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:44 compute-1 sshd-session[231443]: Connection closed by authenticating user root 178.62.249.31 port 35338 [preauth]
Jan 26 10:09:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:45.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:45.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:45 compute-1 sudo[231472]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:09:45 compute-1 sudo[231472]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:09:45 compute-1 sudo[231472]: pam_unix(sudo:session): session closed for user root
Jan 26 10:09:45 compute-1 sudo[231503]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:09:45 compute-1 sudo[231503]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:09:45 compute-1 podman[231496]: 2026-01-26 10:09:45.668606711 +0000 UTC m=+0.049415888 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:09:46 compute-1 sudo[231503]: pam_unix(sudo:session): session closed for user root
Jan 26 10:09:46 compute-1 ceph-mon[80107]: pgmap v826: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Jan 26 10:09:46 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:09:46 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:09:46 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:09:46 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:09:46 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:09:47 compute-1 nova_compute[226322]: 2026-01-26 10:09:47.095 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:09:47 compute-1 nova_compute[226322]: 2026-01-26 10:09:47.095 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:09:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:47.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:47 compute-1 nova_compute[226322]: 2026-01-26 10:09:47.117 226326 DEBUG nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 10:09:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:47.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:47 compute-1 nova_compute[226322]: 2026-01-26 10:09:47.180 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:47 compute-1 nova_compute[226322]: 2026-01-26 10:09:47.197 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:09:47 compute-1 nova_compute[226322]: 2026-01-26 10:09:47.198 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:09:47 compute-1 nova_compute[226322]: 2026-01-26 10:09:47.202 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 10:09:47 compute-1 nova_compute[226322]: 2026-01-26 10:09:47.202 226326 INFO nova.compute.claims [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Claim successful on node compute-1.ctlplane.example.com
Jan 26 10:09:47 compute-1 nova_compute[226322]: 2026-01-26 10:09:47.313 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:09:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:09:47 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1238899685' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:09:47 compute-1 nova_compute[226322]: 2026-01-26 10:09:47.767 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:09:47 compute-1 nova_compute[226322]: 2026-01-26 10:09:47.772 226326 DEBUG nova.compute.provider_tree [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:09:47 compute-1 nova_compute[226322]: 2026-01-26 10:09:47.795 226326 DEBUG nova.scheduler.client.report [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:09:47 compute-1 nova_compute[226322]: 2026-01-26 10:09:47.826 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:09:47 compute-1 nova_compute[226322]: 2026-01-26 10:09:47.827 226326 DEBUG nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 10:09:47 compute-1 nova_compute[226322]: 2026-01-26 10:09:47.884 226326 DEBUG nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 10:09:47 compute-1 nova_compute[226322]: 2026-01-26 10:09:47.885 226326 DEBUG nova.network.neutron [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 10:09:47 compute-1 nova_compute[226322]: 2026-01-26 10:09:47.906 226326 INFO nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 10:09:47 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:09:47 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:09:47 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:09:47 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:09:47 compute-1 nova_compute[226322]: 2026-01-26 10:09:47.929 226326 DEBUG nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.013 226326 DEBUG nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.015 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.016 226326 INFO nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Creating image(s)
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.054 226326 DEBUG nova.storage.rbd_utils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.081 226326 DEBUG nova.storage.rbd_utils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.107 226326 DEBUG nova.storage.rbd_utils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.111 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.186 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.187 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "d81880e926e175d0cc7241caa7cc18231a8a289c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.188 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "d81880e926e175d0cc7241caa7cc18231a8a289c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.188 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "d81880e926e175d0cc7241caa7cc18231a8a289c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.214 226326 DEBUG nova.storage.rbd_utils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.217 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.477 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.260s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.542 226326 DEBUG nova.storage.rbd_utils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] resizing rbd image eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 26 10:09:48 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.646 226326 DEBUG nova.objects.instance [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'migration_context' on Instance uuid eafb89b3-bd76-4bd7-9ba0-c169e9e02d17 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.670 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.670 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Ensure instance console log exists: /var/lib/nova/instances/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.671 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.671 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.671 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:09:48 compute-1 nova_compute[226322]: 2026-01-26 10:09:48.692 226326 DEBUG nova.policy [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c1208d3e25b940ea93fe76884c7a53db', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 10:09:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:09:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:09:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:09:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:49 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:09:49 compute-1 ceph-mon[80107]: pgmap v827: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 0 op/s
Jan 26 10:09:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1238899685' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:09:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:09:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:49.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:49.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:49 compute-1 nova_compute[226322]: 2026-01-26 10:09:49.677 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:50 compute-1 ceph-mon[80107]: pgmap v828: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 0 op/s
Jan 26 10:09:50 compute-1 nova_compute[226322]: 2026-01-26 10:09:50.806 226326 DEBUG nova.network.neutron [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Successfully created port: a3edb0bf-fece-4486-bde7-200b47a3207f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 10:09:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 10:09:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:51.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 10:09:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:51.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:52 compute-1 nova_compute[226322]: 2026-01-26 10:09:52.182 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:52 compute-1 nova_compute[226322]: 2026-01-26 10:09:52.372 226326 DEBUG nova.network.neutron [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Successfully updated port: a3edb0bf-fece-4486-bde7-200b47a3207f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 10:09:52 compute-1 nova_compute[226322]: 2026-01-26 10:09:52.398 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:09:52 compute-1 nova_compute[226322]: 2026-01-26 10:09:52.399 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquired lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:09:52 compute-1 nova_compute[226322]: 2026-01-26 10:09:52.399 226326 DEBUG nova.network.neutron [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 10:09:52 compute-1 nova_compute[226322]: 2026-01-26 10:09:52.523 226326 DEBUG nova.compute.manager [req-78ef31bd-333d-4c21-8bbe-825be9d9f479 req-30aa49d8-3ff1-41ec-9115-5c13badcc7d1 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received event network-changed-a3edb0bf-fece-4486-bde7-200b47a3207f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:09:52 compute-1 nova_compute[226322]: 2026-01-26 10:09:52.524 226326 DEBUG nova.compute.manager [req-78ef31bd-333d-4c21-8bbe-825be9d9f479 req-30aa49d8-3ff1-41ec-9115-5c13badcc7d1 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Refreshing instance network info cache due to event network-changed-a3edb0bf-fece-4486-bde7-200b47a3207f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 10:09:52 compute-1 nova_compute[226322]: 2026-01-26 10:09:52.524 226326 DEBUG oslo_concurrency.lockutils [req-78ef31bd-333d-4c21-8bbe-825be9d9f479 req-30aa49d8-3ff1-41ec-9115-5c13badcc7d1 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:09:52 compute-1 nova_compute[226322]: 2026-01-26 10:09:52.658 226326 DEBUG nova.network.neutron [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 10:09:52 compute-1 ceph-mon[80107]: pgmap v829: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 26 10:09:53 compute-1 sshd-session[231764]: Invalid user admin from 152.42.136.110 port 47962
Jan 26 10:09:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 10:09:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:53.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 10:09:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 10:09:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:53.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 10:09:53 compute-1 sshd-session[231764]: Connection closed by invalid user admin 152.42.136.110 port 47962 [preauth]
Jan 26 10:09:53 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:09:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:53.933 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:09:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:53.934 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:09:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:53.934 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:09:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:09:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:09:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:09:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:54 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.014 226326 DEBUG nova.network.neutron [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Updating instance_info_cache with network_info: [{"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.039 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Releasing lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.039 226326 DEBUG nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Instance network_info: |[{"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.040 226326 DEBUG oslo_concurrency.lockutils [req-78ef31bd-333d-4c21-8bbe-825be9d9f479 req-30aa49d8-3ff1-41ec-9115-5c13badcc7d1 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.040 226326 DEBUG nova.network.neutron [req-78ef31bd-333d-4c21-8bbe-825be9d9f479 req-30aa49d8-3ff1-41ec-9115-5c13badcc7d1 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Refreshing network info cache for port a3edb0bf-fece-4486-bde7-200b47a3207f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.043 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Start _get_guest_xml network_info=[{"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T10:05:39Z,direct_url=<?>,disk_format='qcow2',id=6789692f-fc1f-4efa-ae75-dcc13be695ef,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3ff3fa2a5531460b993c609589aa545d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T10:05:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'size': 0, 'image_id': '6789692f-fc1f-4efa-ae75-dcc13be695ef'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.046 226326 WARNING nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.052 226326 DEBUG nova.virt.libvirt.host [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.053 226326 DEBUG nova.virt.libvirt.host [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.055 226326 DEBUG nova.virt.libvirt.host [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.056 226326 DEBUG nova.virt.libvirt.host [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.056 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.056 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T10:05:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='57e1601b-dbfa-4d3b-8b96-27302e4a7a06',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T10:05:39Z,direct_url=<?>,disk_format='qcow2',id=6789692f-fc1f-4efa-ae75-dcc13be695ef,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3ff3fa2a5531460b993c609589aa545d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T10:05:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.057 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.057 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.057 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.057 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.058 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.058 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.058 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.058 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.058 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.058 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.061 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:09:54 compute-1 ceph-mon[80107]: pgmap v830: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 26 10:09:54 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 26 10:09:54 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1908295695' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.489 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.510 226326 DEBUG nova.storage.rbd_utils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.513 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:09:54 compute-1 sudo[231808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:09:54 compute-1 sudo[231808]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:09:54 compute-1 sudo[231808]: pam_unix(sudo:session): session closed for user root
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.677 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:54 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 26 10:09:54 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2437790993' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.958 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.959 226326 DEBUG nova.virt.libvirt.vif [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T10:09:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1372750228',display_name='tempest-TestNetworkBasicOps-server-1372750228',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1372750228',id=4,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMsvtjjj+6XI+gg1E4Xzo+HtbXxyF3peabH1GoeKNDwXTbi9v3JnK6KlG1vj7QQfwaNJGTtKqqAkDphs9wj+ke0O+NbixfA149oWC+Na1rLDGTUpAAOvTtxkuWhA6lVC4w==',key_name='tempest-TestNetworkBasicOps-796997991',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-318o5yar',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T10:09:47Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=eafb89b3-bd76-4bd7-9ba0-c169e9e02d17,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.959 226326 DEBUG nova.network.os_vif_util [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.960 226326 DEBUG nova.network.os_vif_util [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:93:05:f9,bridge_name='br-int',has_traffic_filtering=True,id=a3edb0bf-fece-4486-bde7-200b47a3207f,network=Network(586582bb-e1bc-4aa4-9785-243b8df2dbcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3edb0bf-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.961 226326 DEBUG nova.objects.instance [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'pci_devices' on Instance uuid eafb89b3-bd76-4bd7-9ba0-c169e9e02d17 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.982 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] End _get_guest_xml xml=<domain type="kvm">
Jan 26 10:09:54 compute-1 nova_compute[226322]:   <uuid>eafb89b3-bd76-4bd7-9ba0-c169e9e02d17</uuid>
Jan 26 10:09:54 compute-1 nova_compute[226322]:   <name>instance-00000004</name>
Jan 26 10:09:54 compute-1 nova_compute[226322]:   <memory>131072</memory>
Jan 26 10:09:54 compute-1 nova_compute[226322]:   <vcpu>1</vcpu>
Jan 26 10:09:54 compute-1 nova_compute[226322]:   <metadata>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <nova:name>tempest-TestNetworkBasicOps-server-1372750228</nova:name>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <nova:creationTime>2026-01-26 10:09:54</nova:creationTime>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <nova:flavor name="m1.nano">
Jan 26 10:09:54 compute-1 nova_compute[226322]:         <nova:memory>128</nova:memory>
Jan 26 10:09:54 compute-1 nova_compute[226322]:         <nova:disk>1</nova:disk>
Jan 26 10:09:54 compute-1 nova_compute[226322]:         <nova:swap>0</nova:swap>
Jan 26 10:09:54 compute-1 nova_compute[226322]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 10:09:54 compute-1 nova_compute[226322]:         <nova:vcpus>1</nova:vcpus>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       </nova:flavor>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <nova:owner>
Jan 26 10:09:54 compute-1 nova_compute[226322]:         <nova:user uuid="c1208d3e25b940ea93fe76884c7a53db">tempest-TestNetworkBasicOps-966559857-project-member</nova:user>
Jan 26 10:09:54 compute-1 nova_compute[226322]:         <nova:project uuid="6ed221b375a44fc2bb2a8f232c5446e7">tempest-TestNetworkBasicOps-966559857</nova:project>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       </nova:owner>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <nova:root type="image" uuid="6789692f-fc1f-4efa-ae75-dcc13be695ef"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <nova:ports>
Jan 26 10:09:54 compute-1 nova_compute[226322]:         <nova:port uuid="a3edb0bf-fece-4486-bde7-200b47a3207f">
Jan 26 10:09:54 compute-1 nova_compute[226322]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:         </nova:port>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       </nova:ports>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     </nova:instance>
Jan 26 10:09:54 compute-1 nova_compute[226322]:   </metadata>
Jan 26 10:09:54 compute-1 nova_compute[226322]:   <sysinfo type="smbios">
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <system>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <entry name="manufacturer">RDO</entry>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <entry name="product">OpenStack Compute</entry>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <entry name="serial">eafb89b3-bd76-4bd7-9ba0-c169e9e02d17</entry>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <entry name="uuid">eafb89b3-bd76-4bd7-9ba0-c169e9e02d17</entry>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <entry name="family">Virtual Machine</entry>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     </system>
Jan 26 10:09:54 compute-1 nova_compute[226322]:   </sysinfo>
Jan 26 10:09:54 compute-1 nova_compute[226322]:   <os>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <boot dev="hd"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <smbios mode="sysinfo"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:   </os>
Jan 26 10:09:54 compute-1 nova_compute[226322]:   <features>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <acpi/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <apic/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <vmcoreinfo/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:   </features>
Jan 26 10:09:54 compute-1 nova_compute[226322]:   <clock offset="utc">
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <timer name="hpet" present="no"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:   </clock>
Jan 26 10:09:54 compute-1 nova_compute[226322]:   <cpu mode="host-model" match="exact">
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:   </cpu>
Jan 26 10:09:54 compute-1 nova_compute[226322]:   <devices>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <disk type="network" device="disk">
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <driver type="raw" cache="none"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <source protocol="rbd" name="vms/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk">
Jan 26 10:09:54 compute-1 nova_compute[226322]:         <host name="192.168.122.100" port="6789"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:         <host name="192.168.122.102" port="6789"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:         <host name="192.168.122.101" port="6789"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       </source>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <auth username="openstack">
Jan 26 10:09:54 compute-1 nova_compute[226322]:         <secret type="ceph" uuid="1a70b85d-e3fd-5814-8a6a-37ea00fcae30"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       </auth>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <target dev="vda" bus="virtio"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     </disk>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <disk type="network" device="cdrom">
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <driver type="raw" cache="none"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <source protocol="rbd" name="vms/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk.config">
Jan 26 10:09:54 compute-1 nova_compute[226322]:         <host name="192.168.122.100" port="6789"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:         <host name="192.168.122.102" port="6789"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:         <host name="192.168.122.101" port="6789"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       </source>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <auth username="openstack">
Jan 26 10:09:54 compute-1 nova_compute[226322]:         <secret type="ceph" uuid="1a70b85d-e3fd-5814-8a6a-37ea00fcae30"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       </auth>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <target dev="sda" bus="sata"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     </disk>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <interface type="ethernet">
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <mac address="fa:16:3e:93:05:f9"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <model type="virtio"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <mtu size="1442"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <target dev="tapa3edb0bf-fe"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     </interface>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <serial type="pty">
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <log file="/var/lib/nova/instances/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17/console.log" append="off"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     </serial>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <video>
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <model type="virtio"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     </video>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <input type="tablet" bus="usb"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <rng model="virtio">
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <backend model="random">/dev/urandom</backend>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     </rng>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <controller type="usb" index="0"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     <memballoon model="virtio">
Jan 26 10:09:54 compute-1 nova_compute[226322]:       <stats period="10"/>
Jan 26 10:09:54 compute-1 nova_compute[226322]:     </memballoon>
Jan 26 10:09:54 compute-1 nova_compute[226322]:   </devices>
Jan 26 10:09:54 compute-1 nova_compute[226322]: </domain>
Jan 26 10:09:54 compute-1 nova_compute[226322]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.984 226326 DEBUG nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Preparing to wait for external event network-vif-plugged-a3edb0bf-fece-4486-bde7-200b47a3207f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.985 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.985 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.986 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.987 226326 DEBUG nova.virt.libvirt.vif [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T10:09:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1372750228',display_name='tempest-TestNetworkBasicOps-server-1372750228',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1372750228',id=4,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMsvtjjj+6XI+gg1E4Xzo+HtbXxyF3peabH1GoeKNDwXTbi9v3JnK6KlG1vj7QQfwaNJGTtKqqAkDphs9wj+ke0O+NbixfA149oWC+Na1rLDGTUpAAOvTtxkuWhA6lVC4w==',key_name='tempest-TestNetworkBasicOps-796997991',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-318o5yar',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T10:09:47Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=eafb89b3-bd76-4bd7-9ba0-c169e9e02d17,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.988 226326 DEBUG nova.network.os_vif_util [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.989 226326 DEBUG nova.network.os_vif_util [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:93:05:f9,bridge_name='br-int',has_traffic_filtering=True,id=a3edb0bf-fece-4486-bde7-200b47a3207f,network=Network(586582bb-e1bc-4aa4-9785-243b8df2dbcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3edb0bf-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.989 226326 DEBUG os_vif [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:93:05:f9,bridge_name='br-int',has_traffic_filtering=True,id=a3edb0bf-fece-4486-bde7-200b47a3207f,network=Network(586582bb-e1bc-4aa4-9785-243b8df2dbcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3edb0bf-fe') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.990 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.991 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.992 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.997 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.997 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa3edb0bf-fe, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:09:54 compute-1 nova_compute[226322]: 2026-01-26 10:09:54.999 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa3edb0bf-fe, col_values=(('external_ids', {'iface-id': 'a3edb0bf-fece-4486-bde7-200b47a3207f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:93:05:f9', 'vm-uuid': 'eafb89b3-bd76-4bd7-9ba0-c169e9e02d17'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:09:55 compute-1 NetworkManager[49073]: <info>  [1769422195.0019] manager: (tapa3edb0bf-fe): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Jan 26 10:09:55 compute-1 nova_compute[226322]: 2026-01-26 10:09:55.002 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:55 compute-1 nova_compute[226322]: 2026-01-26 10:09:55.005 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:09:55 compute-1 nova_compute[226322]: 2026-01-26 10:09:55.008 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:55 compute-1 nova_compute[226322]: 2026-01-26 10:09:55.009 226326 INFO os_vif [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:93:05:f9,bridge_name='br-int',has_traffic_filtering=True,id=a3edb0bf-fece-4486-bde7-200b47a3207f,network=Network(586582bb-e1bc-4aa4-9785-243b8df2dbcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3edb0bf-fe')
Jan 26 10:09:55 compute-1 nova_compute[226322]: 2026-01-26 10:09:55.061 226326 DEBUG nova.network.neutron [req-78ef31bd-333d-4c21-8bbe-825be9d9f479 req-30aa49d8-3ff1-41ec-9115-5c13badcc7d1 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Updated VIF entry in instance network info cache for port a3edb0bf-fece-4486-bde7-200b47a3207f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 10:09:55 compute-1 nova_compute[226322]: 2026-01-26 10:09:55.061 226326 DEBUG nova.network.neutron [req-78ef31bd-333d-4c21-8bbe-825be9d9f479 req-30aa49d8-3ff1-41ec-9115-5c13badcc7d1 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Updating instance_info_cache with network_info: [{"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:09:55 compute-1 nova_compute[226322]: 2026-01-26 10:09:55.064 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 10:09:55 compute-1 nova_compute[226322]: 2026-01-26 10:09:55.065 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 10:09:55 compute-1 nova_compute[226322]: 2026-01-26 10:09:55.065 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No VIF found with MAC fa:16:3e:93:05:f9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 10:09:55 compute-1 nova_compute[226322]: 2026-01-26 10:09:55.065 226326 INFO nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Using config drive
Jan 26 10:09:55 compute-1 nova_compute[226322]: 2026-01-26 10:09:55.087 226326 DEBUG nova.storage.rbd_utils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:09:55 compute-1 nova_compute[226322]: 2026-01-26 10:09:55.091 226326 DEBUG oslo_concurrency.lockutils [req-78ef31bd-333d-4c21-8bbe-825be9d9f479 req-30aa49d8-3ff1-41ec-9115-5c13badcc7d1 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:09:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:55.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:55.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:55 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:09:55 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:09:55 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1908295695' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:09:55 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2437790993' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:09:55 compute-1 nova_compute[226322]: 2026-01-26 10:09:55.667 226326 INFO nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Creating config drive at /var/lib/nova/instances/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17/disk.config
Jan 26 10:09:55 compute-1 nova_compute[226322]: 2026-01-26 10:09:55.673 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpre49voty execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:09:55 compute-1 nova_compute[226322]: 2026-01-26 10:09:55.807 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpre49voty" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:09:55 compute-1 nova_compute[226322]: 2026-01-26 10:09:55.832 226326 DEBUG nova.storage.rbd_utils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:09:55 compute-1 nova_compute[226322]: 2026-01-26 10:09:55.835 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17/disk.config eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:09:56 compute-1 nova_compute[226322]: 2026-01-26 10:09:56.372 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17/disk.config eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.536s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:09:56 compute-1 nova_compute[226322]: 2026-01-26 10:09:56.373 226326 INFO nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Deleting local config drive /var/lib/nova/instances/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17/disk.config because it was imported into RBD.
Jan 26 10:09:56 compute-1 systemd[1]: Starting libvirt secret daemon...
Jan 26 10:09:56 compute-1 systemd[1]: Started libvirt secret daemon.
Jan 26 10:09:56 compute-1 kernel: tapa3edb0bf-fe: entered promiscuous mode
Jan 26 10:09:56 compute-1 ovn_controller[133670]: 2026-01-26T10:09:56Z|00037|binding|INFO|Claiming lport a3edb0bf-fece-4486-bde7-200b47a3207f for this chassis.
Jan 26 10:09:56 compute-1 ovn_controller[133670]: 2026-01-26T10:09:56Z|00038|binding|INFO|a3edb0bf-fece-4486-bde7-200b47a3207f: Claiming fa:16:3e:93:05:f9 10.100.0.6
Jan 26 10:09:56 compute-1 NetworkManager[49073]: <info>  [1769422196.4976] manager: (tapa3edb0bf-fe): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Jan 26 10:09:56 compute-1 nova_compute[226322]: 2026-01-26 10:09:56.496 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:56 compute-1 nova_compute[226322]: 2026-01-26 10:09:56.501 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.509 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:93:05:f9 10.100.0.6'], port_security=['fa:16:3e:93:05:f9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'eafb89b3-bd76-4bd7-9ba0-c169e9e02d17', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-586582bb-e1bc-4aa4-9785-243b8df2dbcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9fcc6dd6-8be1-499a-bb83-b80d1cff1d47', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=83c6eda3-fd53-4ca0-bad4-44f3e56aac7f, chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], logical_port=a3edb0bf-fece-4486-bde7-200b47a3207f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.510 143326 INFO neutron.agent.ovn.metadata.agent [-] Port a3edb0bf-fece-4486-bde7-200b47a3207f in datapath 586582bb-e1bc-4aa4-9785-243b8df2dbcb bound to our chassis
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.512 143326 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 586582bb-e1bc-4aa4-9785-243b8df2dbcb
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.522 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[3d8e7be7-fb53-41c9-9d54-9fe5edbc1d06]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.523 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap586582bb-e1 in ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.525 229912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap586582bb-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.525 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[a6f2a8c5-41f9-43c7-9d4a-5b4dbe308321]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:09:56 compute-1 systemd-machined[194876]: New machine qemu-2-instance-00000004.
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.526 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[4d89784d-fa91-4812-9869-c7a94ccc1de4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.536 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[604ffeee-e557-47a2-87f3-db8c47502f4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:09:56 compute-1 systemd[1]: Started Virtual Machine qemu-2-instance-00000004.
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.562 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[c52612ec-d65f-46ef-a286-528269320c31]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:09:56 compute-1 systemd-udevd[231950]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 10:09:56 compute-1 nova_compute[226322]: 2026-01-26 10:09:56.571 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:56 compute-1 ovn_controller[133670]: 2026-01-26T10:09:56Z|00039|binding|INFO|Setting lport a3edb0bf-fece-4486-bde7-200b47a3207f ovn-installed in OVS
Jan 26 10:09:56 compute-1 ovn_controller[133670]: 2026-01-26T10:09:56Z|00040|binding|INFO|Setting lport a3edb0bf-fece-4486-bde7-200b47a3207f up in Southbound
Jan 26 10:09:56 compute-1 nova_compute[226322]: 2026-01-26 10:09:56.578 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:56 compute-1 NetworkManager[49073]: <info>  [1769422196.5826] device (tapa3edb0bf-fe): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 10:09:56 compute-1 NetworkManager[49073]: <info>  [1769422196.5839] device (tapa3edb0bf-fe): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.590 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[f1368e9b-7932-4abb-9ffb-df4f88f03788]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:09:56 compute-1 NetworkManager[49073]: <info>  [1769422196.5974] manager: (tap586582bb-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.596 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[314f8455-e4e3-46ef-958b-ddc7e0709f5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.621 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[e5aaad4e-cc65-48c3-8789-bc1586aee5c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.625 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[9bd4c300-c473-42cd-8c8f-6fcaab2ebfef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:09:56 compute-1 NetworkManager[49073]: <info>  [1769422196.6458] device (tap586582bb-e0): carrier: link connected
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.650 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[d73dcb94-1598-4fca-9ff7-b5003d00f751]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.665 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[212a0d26-34c2-4dd6-8760-86882d29ff9b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap586582bb-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:be:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415721, 'reachable_time': 21005, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231980, 'error': None, 'target': 'ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.678 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[456378b0-b768-4023-88e8-0b546758560d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:bea7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415721, 'tstamp': 415721}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231981, 'error': None, 'target': 'ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:09:56 compute-1 ceph-mon[80107]: pgmap v831: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.695 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[38c9f083-197b-4c88-8402-d8cd3f8b3a69]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap586582bb-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:be:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415721, 'reachable_time': 21005, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231982, 'error': None, 'target': 'ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.728 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[465182b2-359a-436f-be8b-d408b93a9efe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.782 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[6c35b87d-4700-426e-bbbd-7c0bb63b71c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.784 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap586582bb-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.784 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.784 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap586582bb-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:09:56 compute-1 nova_compute[226322]: 2026-01-26 10:09:56.786 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:56 compute-1 NetworkManager[49073]: <info>  [1769422196.7868] manager: (tap586582bb-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Jan 26 10:09:56 compute-1 kernel: tap586582bb-e0: entered promiscuous mode
Jan 26 10:09:56 compute-1 nova_compute[226322]: 2026-01-26 10:09:56.789 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.789 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap586582bb-e0, col_values=(('external_ids', {'iface-id': 'e7d448a7-b7a2-4f6b-a25b-09ac0981ee0c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:09:56 compute-1 nova_compute[226322]: 2026-01-26 10:09:56.790 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:56 compute-1 ovn_controller[133670]: 2026-01-26T10:09:56Z|00041|binding|INFO|Releasing lport e7d448a7-b7a2-4f6b-a25b-09ac0981ee0c from this chassis (sb_readonly=0)
Jan 26 10:09:56 compute-1 nova_compute[226322]: 2026-01-26 10:09:56.821 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.822 143326 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/586582bb-e1bc-4aa4-9785-243b8df2dbcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/586582bb-e1bc-4aa4-9785-243b8df2dbcb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.823 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[ba6503c1-6a77-4405-a3ce-30aa148249dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.825 143326 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: global
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     log         /dev/log local0 debug
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     log-tag     haproxy-metadata-proxy-586582bb-e1bc-4aa4-9785-243b8df2dbcb
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     user        root
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     group       root
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     maxconn     1024
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     pidfile     /var/lib/neutron/external/pids/586582bb-e1bc-4aa4-9785-243b8df2dbcb.pid.haproxy
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     daemon
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: defaults
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     log global
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     mode http
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     option httplog
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     option dontlognull
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     option http-server-close
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     option forwardfor
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     retries                 3
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     timeout http-request    30s
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     timeout connect         30s
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     timeout client          32s
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     timeout server          32s
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     timeout http-keep-alive 30s
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: listen listener
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     bind 169.254.169.254:80
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:     http-request add-header X-OVN-Network-ID 586582bb-e1bc-4aa4-9785-243b8df2dbcb
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 10:09:56 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.829 143326 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb', 'env', 'PROCESS_TAG=haproxy-586582bb-e1bc-4aa4-9785-243b8df2dbcb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/586582bb-e1bc-4aa4-9785-243b8df2dbcb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 10:09:56 compute-1 nova_compute[226322]: 2026-01-26 10:09:56.918 226326 DEBUG nova.compute.manager [req-30538185-e811-4e35-9b27-8e46b84690c8 req-25d17654-f4df-4980-919c-36800ab7912d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received event network-vif-plugged-a3edb0bf-fece-4486-bde7-200b47a3207f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:09:56 compute-1 nova_compute[226322]: 2026-01-26 10:09:56.918 226326 DEBUG oslo_concurrency.lockutils [req-30538185-e811-4e35-9b27-8e46b84690c8 req-25d17654-f4df-4980-919c-36800ab7912d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:09:56 compute-1 nova_compute[226322]: 2026-01-26 10:09:56.919 226326 DEBUG oslo_concurrency.lockutils [req-30538185-e811-4e35-9b27-8e46b84690c8 req-25d17654-f4df-4980-919c-36800ab7912d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:09:56 compute-1 nova_compute[226322]: 2026-01-26 10:09:56.919 226326 DEBUG oslo_concurrency.lockutils [req-30538185-e811-4e35-9b27-8e46b84690c8 req-25d17654-f4df-4980-919c-36800ab7912d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:09:56 compute-1 nova_compute[226322]: 2026-01-26 10:09:56.919 226326 DEBUG nova.compute.manager [req-30538185-e811-4e35-9b27-8e46b84690c8 req-25d17654-f4df-4980-919c-36800ab7912d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Processing event network-vif-plugged-a3edb0bf-fece-4486-bde7-200b47a3207f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 10:09:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:57.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:57.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:57 compute-1 podman[232049]: 2026-01-26 10:09:57.249742871 +0000 UTC m=+0.053873667 container create 6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.274 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422197.2742188, eafb89b3-bd76-4bd7-9ba0-c169e9e02d17 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.275 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] VM Started (Lifecycle Event)
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.279 226326 DEBUG nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 10:09:57 compute-1 systemd[1]: Started libpod-conmon-6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705.scope.
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.283 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.286 226326 INFO nova.virt.libvirt.driver [-] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Instance spawned successfully.
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.286 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 10:09:57 compute-1 systemd[1]: Started libcrun container.
Jan 26 10:09:57 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3d3f80d88a6c2d2bff340e38e70bb3d56800b97343639f380b5d8e3058e1a4a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 10:09:57 compute-1 podman[232049]: 2026-01-26 10:09:57.222156006 +0000 UTC m=+0.026286852 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 26 10:09:57 compute-1 podman[232049]: 2026-01-26 10:09:57.325909545 +0000 UTC m=+0.130040371 container init 6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.328 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:09:57 compute-1 podman[232049]: 2026-01-26 10:09:57.331470821 +0000 UTC m=+0.135601617 container start 6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.334 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.337 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.337 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.337 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.338 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.338 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.339 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:09:57 compute-1 neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb[232069]: [NOTICE]   (232073) : New worker (232075) forked
Jan 26 10:09:57 compute-1 neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb[232069]: [NOTICE]   (232073) : Loading success.
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.389 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.390 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422197.2744265, eafb89b3-bd76-4bd7-9ba0-c169e9e02d17 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.390 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] VM Paused (Lifecycle Event)
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.425 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.429 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422197.281973, eafb89b3-bd76-4bd7-9ba0-c169e9e02d17 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.429 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] VM Resumed (Lifecycle Event)
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.437 226326 INFO nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Took 9.42 seconds to spawn the instance on the hypervisor.
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.437 226326 DEBUG nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.484 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.488 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.525 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.537 226326 INFO nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Took 10.37 seconds to build instance.
Jan 26 10:09:57 compute-1 nova_compute[226322]: 2026-01-26 10:09:57.557 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.462s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:09:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 10:09:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1722280942' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:09:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 10:09:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1722280942' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:09:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:09:58 compute-1 ceph-mon[80107]: pgmap v832: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 26 10:09:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/1722280942' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:09:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/1722280942' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:09:58 compute-1 nova_compute[226322]: 2026-01-26 10:09:58.998 226326 DEBUG nova.compute.manager [req-0e11bfb7-d2b0-45d4-be60-bbac55617caf req-f77ed1b6-72ce-4b00-93a5-386c536279d3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received event network-vif-plugged-a3edb0bf-fece-4486-bde7-200b47a3207f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:09:58 compute-1 nova_compute[226322]: 2026-01-26 10:09:58.998 226326 DEBUG oslo_concurrency.lockutils [req-0e11bfb7-d2b0-45d4-be60-bbac55617caf req-f77ed1b6-72ce-4b00-93a5-386c536279d3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:09:58 compute-1 nova_compute[226322]: 2026-01-26 10:09:58.999 226326 DEBUG oslo_concurrency.lockutils [req-0e11bfb7-d2b0-45d4-be60-bbac55617caf req-f77ed1b6-72ce-4b00-93a5-386c536279d3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:09:58 compute-1 nova_compute[226322]: 2026-01-26 10:09:58.999 226326 DEBUG oslo_concurrency.lockutils [req-0e11bfb7-d2b0-45d4-be60-bbac55617caf req-f77ed1b6-72ce-4b00-93a5-386c536279d3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:09:58 compute-1 nova_compute[226322]: 2026-01-26 10:09:58.999 226326 DEBUG nova.compute.manager [req-0e11bfb7-d2b0-45d4-be60-bbac55617caf req-f77ed1b6-72ce-4b00-93a5-386c536279d3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] No waiting events found dispatching network-vif-plugged-a3edb0bf-fece-4486-bde7-200b47a3207f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 10:09:58 compute-1 nova_compute[226322]: 2026-01-26 10:09:58.999 226326 WARNING nova.compute.manager [req-0e11bfb7-d2b0-45d4-be60-bbac55617caf req-f77ed1b6-72ce-4b00-93a5-386c536279d3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received unexpected event network-vif-plugged-a3edb0bf-fece-4486-bde7-200b47a3207f for instance with vm_state active and task_state None.
Jan 26 10:09:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:09:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:09:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:09:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:59 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:09:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:59.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:09:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:09:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:59.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:09:59 compute-1 nova_compute[226322]: 2026-01-26 10:09:59.680 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:00 compute-1 nova_compute[226322]: 2026-01-26 10:10:00.002 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:00 compute-1 ceph-mon[80107]: pgmap v833: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 26 10:10:00 compute-1 ceph-mon[80107]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Jan 26 10:10:00 compute-1 ceph-mon[80107]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Jan 26 10:10:00 compute-1 ceph-mon[80107]:      osd.2 observed slow operation indications in BlueStore
Jan 26 10:10:00 compute-1 ovn_controller[133670]: 2026-01-26T10:10:00Z|00042|binding|INFO|Releasing lport e7d448a7-b7a2-4f6b-a25b-09ac0981ee0c from this chassis (sb_readonly=0)
Jan 26 10:10:00 compute-1 NetworkManager[49073]: <info>  [1769422200.9570] manager: (patch-br-int-to-provnet-94d9950f-5cf2-4813-9455-dd14377245f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Jan 26 10:10:00 compute-1 NetworkManager[49073]: <info>  [1769422200.9578] manager: (patch-provnet-94d9950f-5cf2-4813-9455-dd14377245f4-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Jan 26 10:10:00 compute-1 nova_compute[226322]: 2026-01-26 10:10:00.959 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:00 compute-1 nova_compute[226322]: 2026-01-26 10:10:00.989 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:00 compute-1 ovn_controller[133670]: 2026-01-26T10:10:00Z|00043|binding|INFO|Releasing lport e7d448a7-b7a2-4f6b-a25b-09ac0981ee0c from this chassis (sb_readonly=0)
Jan 26 10:10:00 compute-1 nova_compute[226322]: 2026-01-26 10:10:00.995 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:01.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:01.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:01 compute-1 nova_compute[226322]: 2026-01-26 10:10:01.394 226326 DEBUG nova.compute.manager [req-5c5f7af9-fa47-4f8f-8992-89a887ee43d4 req-f03b095c-14b2-401c-9ca6-5785a3e4fdb9 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received event network-changed-a3edb0bf-fece-4486-bde7-200b47a3207f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:10:01 compute-1 nova_compute[226322]: 2026-01-26 10:10:01.394 226326 DEBUG nova.compute.manager [req-5c5f7af9-fa47-4f8f-8992-89a887ee43d4 req-f03b095c-14b2-401c-9ca6-5785a3e4fdb9 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Refreshing instance network info cache due to event network-changed-a3edb0bf-fece-4486-bde7-200b47a3207f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 10:10:01 compute-1 nova_compute[226322]: 2026-01-26 10:10:01.395 226326 DEBUG oslo_concurrency.lockutils [req-5c5f7af9-fa47-4f8f-8992-89a887ee43d4 req-f03b095c-14b2-401c-9ca6-5785a3e4fdb9 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:10:01 compute-1 nova_compute[226322]: 2026-01-26 10:10:01.395 226326 DEBUG oslo_concurrency.lockutils [req-5c5f7af9-fa47-4f8f-8992-89a887ee43d4 req-f03b095c-14b2-401c-9ca6-5785a3e4fdb9 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:10:01 compute-1 nova_compute[226322]: 2026-01-26 10:10:01.395 226326 DEBUG nova.network.neutron [req-5c5f7af9-fa47-4f8f-8992-89a887ee43d4 req-f03b095c-14b2-401c-9ca6-5785a3e4fdb9 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Refreshing network info cache for port a3edb0bf-fece-4486-bde7-200b47a3207f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 10:10:02 compute-1 nova_compute[226322]: 2026-01-26 10:10:02.302 226326 DEBUG nova.network.neutron [req-5c5f7af9-fa47-4f8f-8992-89a887ee43d4 req-f03b095c-14b2-401c-9ca6-5785a3e4fdb9 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Updated VIF entry in instance network info cache for port a3edb0bf-fece-4486-bde7-200b47a3207f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 10:10:02 compute-1 nova_compute[226322]: 2026-01-26 10:10:02.303 226326 DEBUG nova.network.neutron [req-5c5f7af9-fa47-4f8f-8992-89a887ee43d4 req-f03b095c-14b2-401c-9ca6-5785a3e4fdb9 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Updating instance_info_cache with network_info: [{"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:10:02 compute-1 nova_compute[226322]: 2026-01-26 10:10:02.322 226326 DEBUG oslo_concurrency.lockutils [req-5c5f7af9-fa47-4f8f-8992-89a887ee43d4 req-f03b095c-14b2-401c-9ca6-5785a3e4fdb9 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:10:02 compute-1 podman[232088]: 2026-01-26 10:10:02.336011912 +0000 UTC m=+0.100097673 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 10:10:02 compute-1 sshd-session[232116]: Invalid user centos from 104.248.198.71 port 49982
Jan 26 10:10:02 compute-1 ceph-mon[80107]: pgmap v834: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Jan 26 10:10:03 compute-1 sshd-session[232116]: Connection closed by invalid user centos 104.248.198.71 port 49982 [preauth]
Jan 26 10:10:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:03.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:03.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:03 compute-1 sudo[232118]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:10:03 compute-1 sudo[232118]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:10:03 compute-1 sudo[232118]: pam_unix(sudo:session): session closed for user root
Jan 26 10:10:03 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:10:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:10:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:10:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:10:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:04 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:10:04 compute-1 ceph-mon[80107]: pgmap v835: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 26 10:10:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:10:04 compute-1 nova_compute[226322]: 2026-01-26 10:10:04.681 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:05 compute-1 nova_compute[226322]: 2026-01-26 10:10:05.004 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:05.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:05.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:06 compute-1 ceph-mon[80107]: pgmap v836: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 26 10:10:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:07.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:07.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:08 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:10:08 compute-1 ceph-mon[80107]: pgmap v837: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 26 10:10:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:10:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:10:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:10:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:09 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:10:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:09.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:09.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:09 compute-1 nova_compute[226322]: 2026-01-26 10:10:09.685 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:10 compute-1 nova_compute[226322]: 2026-01-26 10:10:10.006 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:10 compute-1 ceph-mon[80107]: pgmap v838: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 26 10:10:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:11.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:11.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:11 compute-1 ovn_controller[133670]: 2026-01-26T10:10:11Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:93:05:f9 10.100.0.6
Jan 26 10:10:11 compute-1 ovn_controller[133670]: 2026-01-26T10:10:11Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:93:05:f9 10.100.0.6
Jan 26 10:10:12 compute-1 ceph-mon[80107]: pgmap v839: 353 pgs: 353 active+clean; 113 MiB data, 305 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 2.1 MiB/s wr, 119 op/s
Jan 26 10:10:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000052s ======
Jan 26 10:10:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:13.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000052s
Jan 26 10:10:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:13.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:13 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:10:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:10:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:10:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:10:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:14 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:10:14 compute-1 nova_compute[226322]: 2026-01-26 10:10:14.686 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:15 compute-1 nova_compute[226322]: 2026-01-26 10:10:15.007 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:15.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:15 compute-1 ceph-mon[80107]: pgmap v840: 353 pgs: 353 active+clean; 113 MiB data, 305 MiB used, 60 GiB / 60 GiB avail; 210 KiB/s rd, 2.0 MiB/s wr, 45 op/s
Jan 26 10:10:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:15.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:15 compute-1 nova_compute[226322]: 2026-01-26 10:10:15.869 226326 INFO nova.compute.manager [None req-91a8e286-9707-48a9-bb85-8598d8c787c1 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Get console output
Jan 26 10:10:15 compute-1 nova_compute[226322]: 2026-01-26 10:10:15.877 230154 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 10:10:16 compute-1 ceph-mon[80107]: pgmap v841: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 270 KiB/s rd, 2.1 MiB/s wr, 59 op/s
Jan 26 10:10:16 compute-1 podman[232151]: 2026-01-26 10:10:16.312168071 +0000 UTC m=+0.082262374 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 10:10:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:17.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:17.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:18 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:10:18 compute-1 ceph-mon[80107]: pgmap v842: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 269 KiB/s rd, 2.1 MiB/s wr, 59 op/s
Jan 26 10:10:18 compute-1 nova_compute[226322]: 2026-01-26 10:10:18.945 226326 DEBUG nova.compute.manager [req-356e8179-d484-43ca-8394-75bc85c19554 req-b5c4e284-279c-4714-baaf-46ff65eb7a3d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received event network-changed-a3edb0bf-fece-4486-bde7-200b47a3207f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:10:18 compute-1 nova_compute[226322]: 2026-01-26 10:10:18.945 226326 DEBUG nova.compute.manager [req-356e8179-d484-43ca-8394-75bc85c19554 req-b5c4e284-279c-4714-baaf-46ff65eb7a3d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Refreshing instance network info cache due to event network-changed-a3edb0bf-fece-4486-bde7-200b47a3207f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 10:10:18 compute-1 nova_compute[226322]: 2026-01-26 10:10:18.946 226326 DEBUG oslo_concurrency.lockutils [req-356e8179-d484-43ca-8394-75bc85c19554 req-b5c4e284-279c-4714-baaf-46ff65eb7a3d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:10:18 compute-1 nova_compute[226322]: 2026-01-26 10:10:18.946 226326 DEBUG oslo_concurrency.lockutils [req-356e8179-d484-43ca-8394-75bc85c19554 req-b5c4e284-279c-4714-baaf-46ff65eb7a3d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:10:18 compute-1 nova_compute[226322]: 2026-01-26 10:10:18.946 226326 DEBUG nova.network.neutron [req-356e8179-d484-43ca-8394-75bc85c19554 req-b5c4e284-279c-4714-baaf-46ff65eb7a3d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Refreshing network info cache for port a3edb0bf-fece-4486-bde7-200b47a3207f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 10:10:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:10:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:10:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:10:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:10:19 compute-1 nova_compute[226322]: 2026-01-26 10:10:19.060 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:19 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:10:19.060 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:10:19 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:10:19.061 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 10:10:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:19.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:19.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:19 compute-1 nova_compute[226322]: 2026-01-26 10:10:19.689 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:10:20 compute-1 nova_compute[226322]: 2026-01-26 10:10:20.008 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:20 compute-1 nova_compute[226322]: 2026-01-26 10:10:20.752 226326 DEBUG nova.network.neutron [req-356e8179-d484-43ca-8394-75bc85c19554 req-b5c4e284-279c-4714-baaf-46ff65eb7a3d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Updated VIF entry in instance network info cache for port a3edb0bf-fece-4486-bde7-200b47a3207f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 10:10:20 compute-1 nova_compute[226322]: 2026-01-26 10:10:20.752 226326 DEBUG nova.network.neutron [req-356e8179-d484-43ca-8394-75bc85c19554 req-b5c4e284-279c-4714-baaf-46ff65eb7a3d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Updating instance_info_cache with network_info: [{"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:10:20 compute-1 nova_compute[226322]: 2026-01-26 10:10:20.771 226326 DEBUG oslo_concurrency.lockutils [req-356e8179-d484-43ca-8394-75bc85c19554 req-b5c4e284-279c-4714-baaf-46ff65eb7a3d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:10:20 compute-1 ceph-mon[80107]: pgmap v843: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 269 KiB/s rd, 2.1 MiB/s wr, 59 op/s
Jan 26 10:10:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:21.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:21.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:21 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 10:10:21 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 8201 writes, 31K keys, 8201 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 8201 writes, 1912 syncs, 4.29 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1363 writes, 4432 keys, 1363 commit groups, 1.0 writes per commit group, ingest: 4.48 MB, 0.01 MB/s
                                           Interval WAL: 1363 writes, 582 syncs, 2.34 writes per sync, written: 0.00 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 26 10:10:22 compute-1 ceph-mon[80107]: pgmap v844: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 275 KiB/s rd, 2.1 MiB/s wr, 60 op/s
Jan 26 10:10:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:23.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:23.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:23 compute-1 sudo[232175]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:10:23 compute-1 sudo[232175]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:10:23 compute-1 sudo[232175]: pam_unix(sudo:session): session closed for user root
Jan 26 10:10:23 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:10:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:10:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:10:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:10:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:10:24 compute-1 ceph-mon[80107]: pgmap v845: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 66 KiB/s rd, 103 KiB/s wr, 15 op/s
Jan 26 10:10:24 compute-1 nova_compute[226322]: 2026-01-26 10:10:24.692 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:25 compute-1 nova_compute[226322]: 2026-01-26 10:10:25.009 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:25.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:25.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:26 compute-1 ceph-mon[80107]: pgmap v846: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 66 KiB/s rd, 106 KiB/s wr, 15 op/s
Jan 26 10:10:26 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/4287714874' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:10:27 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:10:27.063 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:10:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:27.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:27.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:28 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:10:28 compute-1 ceph-mon[80107]: pgmap v847: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 6.0 KiB/s rd, 16 KiB/s wr, 1 op/s
Jan 26 10:10:28 compute-1 nova_compute[226322]: 2026-01-26 10:10:28.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:10:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:10:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:10:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:10:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:10:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:29.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:29.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:29 compute-1 nova_compute[226322]: 2026-01-26 10:10:29.694 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:30 compute-1 nova_compute[226322]: 2026-01-26 10:10:30.010 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:30 compute-1 sshd-session[232203]: Connection closed by authenticating user root 178.62.249.31 port 45502 [preauth]
Jan 26 10:10:30 compute-1 nova_compute[226322]: 2026-01-26 10:10:30.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:10:30 compute-1 nova_compute[226322]: 2026-01-26 10:10:30.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:10:30 compute-1 nova_compute[226322]: 2026-01-26 10:10:30.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:10:30 compute-1 ceph-mon[80107]: pgmap v848: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 6.0 KiB/s rd, 16 KiB/s wr, 1 op/s
Jan 26 10:10:30 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3449492847' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:10:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:31.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:31.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:31 compute-1 nova_compute[226322]: 2026-01-26 10:10:31.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:10:31 compute-1 nova_compute[226322]: 2026-01-26 10:10:31.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:10:31 compute-1 nova_compute[226322]: 2026-01-26 10:10:31.685 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:10:31 compute-1 nova_compute[226322]: 2026-01-26 10:10:31.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:10:31 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/4146777078' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:10:31 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1769423089' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:10:32 compute-1 sshd-session[232207]: Invalid user admin from 152.42.136.110 port 36978
Jan 26 10:10:32 compute-1 nova_compute[226322]: 2026-01-26 10:10:32.135 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:10:32 compute-1 nova_compute[226322]: 2026-01-26 10:10:32.136 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquired lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:10:32 compute-1 nova_compute[226322]: 2026-01-26 10:10:32.136 226326 DEBUG nova.network.neutron [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 10:10:32 compute-1 nova_compute[226322]: 2026-01-26 10:10:32.136 226326 DEBUG nova.objects.instance [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid eafb89b3-bd76-4bd7-9ba0-c169e9e02d17 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 10:10:32 compute-1 sshd-session[232207]: Connection closed by invalid user admin 152.42.136.110 port 36978 [preauth]
Jan 26 10:10:33 compute-1 ceph-mon[80107]: pgmap v849: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 1.8 MiB/s wr, 29 op/s
Jan 26 10:10:33 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2218755935' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:10:33 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1680251038' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:10:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:33.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:33.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:33 compute-1 podman[232209]: 2026-01-26 10:10:33.362465196 +0000 UTC m=+0.139902691 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 26 10:10:33 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:10:33 compute-1 nova_compute[226322]: 2026-01-26 10:10:33.625 226326 DEBUG nova.network.neutron [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Updating instance_info_cache with network_info: [{"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:10:33 compute-1 nova_compute[226322]: 2026-01-26 10:10:33.640 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Releasing lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:10:33 compute-1 nova_compute[226322]: 2026-01-26 10:10:33.640 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 10:10:33 compute-1 nova_compute[226322]: 2026-01-26 10:10:33.641 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:10:33 compute-1 nova_compute[226322]: 2026-01-26 10:10:33.641 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:10:33 compute-1 nova_compute[226322]: 2026-01-26 10:10:33.641 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:10:33 compute-1 nova_compute[226322]: 2026-01-26 10:10:33.660 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:10:33 compute-1 nova_compute[226322]: 2026-01-26 10:10:33.661 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:10:33 compute-1 nova_compute[226322]: 2026-01-26 10:10:33.661 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:10:33 compute-1 nova_compute[226322]: 2026-01-26 10:10:33.662 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:10:33 compute-1 nova_compute[226322]: 2026-01-26 10:10:33.662 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:10:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:10:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:10:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:10:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:34 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:10:34 compute-1 ceph-mon[80107]: pgmap v850: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Jan 26 10:10:34 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2371929807' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:10:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:10:34 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:10:34 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2211083446' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:10:34 compute-1 nova_compute[226322]: 2026-01-26 10:10:34.174 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:10:34 compute-1 nova_compute[226322]: 2026-01-26 10:10:34.256 226326 DEBUG nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] skipping disk for instance-00000004 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 26 10:10:34 compute-1 nova_compute[226322]: 2026-01-26 10:10:34.257 226326 DEBUG nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] skipping disk for instance-00000004 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 26 10:10:34 compute-1 nova_compute[226322]: 2026-01-26 10:10:34.384 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:10:34 compute-1 nova_compute[226322]: 2026-01-26 10:10:34.385 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4711MB free_disk=59.92194747924805GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:10:34 compute-1 nova_compute[226322]: 2026-01-26 10:10:34.385 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:10:34 compute-1 nova_compute[226322]: 2026-01-26 10:10:34.385 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:10:34 compute-1 nova_compute[226322]: 2026-01-26 10:10:34.471 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Instance eafb89b3-bd76-4bd7-9ba0-c169e9e02d17 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 10:10:34 compute-1 nova_compute[226322]: 2026-01-26 10:10:34.471 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:10:34 compute-1 nova_compute[226322]: 2026-01-26 10:10:34.471 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:10:34 compute-1 nova_compute[226322]: 2026-01-26 10:10:34.522 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:10:34 compute-1 nova_compute[226322]: 2026-01-26 10:10:34.696 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:34 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:10:34 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/577859797' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:10:34 compute-1 nova_compute[226322]: 2026-01-26 10:10:34.972 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:10:34 compute-1 nova_compute[226322]: 2026-01-26 10:10:34.979 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:10:35 compute-1 nova_compute[226322]: 2026-01-26 10:10:35.000 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:10:35 compute-1 nova_compute[226322]: 2026-01-26 10:10:35.011 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:35 compute-1 nova_compute[226322]: 2026-01-26 10:10:35.025 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:10:35 compute-1 nova_compute[226322]: 2026-01-26 10:10:35.025 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:10:35 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2211083446' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:10:35 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/577859797' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:10:35 compute-1 nova_compute[226322]: 2026-01-26 10:10:35.071 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:10:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:35.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000052s ======
Jan 26 10:10:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:35.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000052s
Jan 26 10:10:36 compute-1 ceph-mon[80107]: pgmap v851: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 100 op/s
Jan 26 10:10:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:37.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:37.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:38 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:10:38 compute-1 ceph-mon[80107]: pgmap v852: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 99 op/s
Jan 26 10:10:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:10:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:10:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:10:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:39 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:10:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:39.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:39.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:39 compute-1 nova_compute[226322]: 2026-01-26 10:10:39.698 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:40 compute-1 nova_compute[226322]: 2026-01-26 10:10:40.013 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:40 compute-1 ceph-mon[80107]: pgmap v853: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 99 op/s
Jan 26 10:10:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:41.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:41.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:42 compute-1 ceph-mon[80107]: pgmap v854: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Jan 26 10:10:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:43.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:43.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:43 compute-1 sudo[232285]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:10:43 compute-1 sudo[232285]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:10:43 compute-1 sudo[232285]: pam_unix(sudo:session): session closed for user root
Jan 26 10:10:43 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:10:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:10:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:10:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:10:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:44 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:10:44 compute-1 nova_compute[226322]: 2026-01-26 10:10:44.700 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:44 compute-1 ceph-mon[80107]: pgmap v855: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 26 10:10:45 compute-1 nova_compute[226322]: 2026-01-26 10:10:45.015 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:45.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:45.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:46 compute-1 ceph-mon[80107]: pgmap v856: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 15 KiB/s wr, 76 op/s
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.000800) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422247000823, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2365, "num_deletes": 251, "total_data_size": 6250910, "memory_usage": 6354824, "flush_reason": "Manual Compaction"}
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422247022013, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 4034793, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 26008, "largest_seqno": 28367, "table_properties": {"data_size": 4025507, "index_size": 5780, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19611, "raw_average_key_size": 20, "raw_value_size": 4006696, "raw_average_value_size": 4134, "num_data_blocks": 254, "num_entries": 969, "num_filter_entries": 969, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769422036, "oldest_key_time": 1769422036, "file_creation_time": 1769422246, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 21326 microseconds, and 7878 cpu microseconds.
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.022122) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 4034793 bytes OK
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.022165) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.030049) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.030067) EVENT_LOG_v1 {"time_micros": 1769422247030063, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.030084) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 6240512, prev total WAL file size 6240512, number of live WAL files 2.
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.031674) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(3940KB)], [51(12MB)]
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422247031827, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 16667635, "oldest_snapshot_seqno": -1}
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5823 keys, 14554152 bytes, temperature: kUnknown
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422247148627, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 14554152, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14514377, "index_size": 24113, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14597, "raw_key_size": 147983, "raw_average_key_size": 25, "raw_value_size": 14408282, "raw_average_value_size": 2474, "num_data_blocks": 984, "num_entries": 5823, "num_filter_entries": 5823, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769422247, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.149008) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 14554152 bytes
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.150383) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 142.5 rd, 124.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.8, 12.0 +0.0 blob) out(13.9 +0.0 blob), read-write-amplify(7.7) write-amplify(3.6) OK, records in: 6339, records dropped: 516 output_compression: NoCompression
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.150402) EVENT_LOG_v1 {"time_micros": 1769422247150393, "job": 30, "event": "compaction_finished", "compaction_time_micros": 116951, "compaction_time_cpu_micros": 53158, "output_level": 6, "num_output_files": 1, "total_output_size": 14554152, "num_input_records": 6339, "num_output_records": 5823, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422247151592, "job": 30, "event": "table_file_deletion", "file_number": 53}
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422247154415, "job": 30, "event": "table_file_deletion", "file_number": 51}
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.031526) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.154525) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.154532) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.154535) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.154538) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:10:47 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.154541) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:10:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:47.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:47.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:47 compute-1 podman[232312]: 2026-01-26 10:10:47.291745691 +0000 UTC m=+0.066122020 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 10:10:48 compute-1 ceph-mon[80107]: pgmap v857: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 69 KiB/s rd, 2.7 KiB/s wr, 4 op/s
Jan 26 10:10:48 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:10:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:10:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:10:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:10:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:49 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:10:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:10:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:49.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:49.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:49 compute-1 sshd-session[232333]: Invalid user centos from 104.248.198.71 port 58794
Jan 26 10:10:49 compute-1 sshd-session[232333]: Connection closed by invalid user centos 104.248.198.71 port 58794 [preauth]
Jan 26 10:10:49 compute-1 nova_compute[226322]: 2026-01-26 10:10:49.703 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:50 compute-1 nova_compute[226322]: 2026-01-26 10:10:50.017 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:50 compute-1 ceph-mon[80107]: pgmap v858: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 69 KiB/s rd, 2.7 KiB/s wr, 4 op/s
Jan 26 10:10:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:51.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:51.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:52 compute-1 ceph-mon[80107]: pgmap v859: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 457 KiB/s rd, 2.1 MiB/s wr, 69 op/s
Jan 26 10:10:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:53.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:53.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:53 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:10:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:10:53.935 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:10:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:10:53.936 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:10:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:10:53.936 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:10:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:54 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:10:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:54 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:10:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:54 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:10:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:54 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:10:54 compute-1 ceph-mon[80107]: pgmap v860: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 391 KiB/s rd, 2.1 MiB/s wr, 67 op/s
Jan 26 10:10:54 compute-1 nova_compute[226322]: 2026-01-26 10:10:54.734 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:54 compute-1 sudo[232338]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:10:54 compute-1 sudo[232338]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:10:54 compute-1 sudo[232338]: pam_unix(sudo:session): session closed for user root
Jan 26 10:10:54 compute-1 sudo[232363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Jan 26 10:10:54 compute-1 sudo[232363]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:10:55 compute-1 nova_compute[226322]: 2026-01-26 10:10:55.019 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:10:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:55.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:10:55 compute-1 sudo[232363]: pam_unix(sudo:session): session closed for user root
Jan 26 10:10:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:55.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:55 compute-1 sudo[232409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:10:55 compute-1 sudo[232409]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:10:55 compute-1 sudo[232409]: pam_unix(sudo:session): session closed for user root
Jan 26 10:10:55 compute-1 sudo[232434]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:10:55 compute-1 sudo[232434]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:10:55 compute-1 sudo[232434]: pam_unix(sudo:session): session closed for user root
Jan 26 10:10:56 compute-1 sudo[232493]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:10:56 compute-1 sudo[232493]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:10:56 compute-1 sudo[232493]: pam_unix(sudo:session): session closed for user root
Jan 26 10:10:56 compute-1 sudo[232518]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30 -- inventory --format=json-pretty --filter-for-batch
Jan 26 10:10:56 compute-1 sudo[232518]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:10:56 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:10:56 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:10:56 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:10:56 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:10:56 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2141004236' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:10:56 compute-1 ceph-mon[80107]: pgmap v861: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 411 KiB/s rd, 2.1 MiB/s wr, 96 op/s
Jan 26 10:10:56 compute-1 podman[232582]: 2026-01-26 10:10:56.465827683 +0000 UTC m=+0.031420687 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 10:10:56 compute-1 podman[232582]: 2026-01-26 10:10:56.56684155 +0000 UTC m=+0.132434534 container create 340ee6cd798e2b37f6584634f872fdceb3b8e8b15ef1001bd559aa2f92cdfc7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_chaum, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 10:10:56 compute-1 systemd[1]: Started libpod-conmon-340ee6cd798e2b37f6584634f872fdceb3b8e8b15ef1001bd559aa2f92cdfc7c.scope.
Jan 26 10:10:56 compute-1 systemd[1]: Started libcrun container.
Jan 26 10:10:56 compute-1 podman[232582]: 2026-01-26 10:10:56.669294405 +0000 UTC m=+0.234887409 container init 340ee6cd798e2b37f6584634f872fdceb3b8e8b15ef1001bd559aa2f92cdfc7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_chaum, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 10:10:56 compute-1 podman[232582]: 2026-01-26 10:10:56.681172487 +0000 UTC m=+0.246765471 container start 340ee6cd798e2b37f6584634f872fdceb3b8e8b15ef1001bd559aa2f92cdfc7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_chaum, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 10:10:56 compute-1 podman[232582]: 2026-01-26 10:10:56.686174669 +0000 UTC m=+0.251767653 container attach 340ee6cd798e2b37f6584634f872fdceb3b8e8b15ef1001bd559aa2f92cdfc7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_chaum, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Jan 26 10:10:56 compute-1 systemd[1]: libpod-340ee6cd798e2b37f6584634f872fdceb3b8e8b15ef1001bd559aa2f92cdfc7c.scope: Deactivated successfully.
Jan 26 10:10:56 compute-1 mystifying_chaum[232598]: 167 167
Jan 26 10:10:56 compute-1 podman[232582]: 2026-01-26 10:10:56.690345328 +0000 UTC m=+0.255938322 container died 340ee6cd798e2b37f6584634f872fdceb3b8e8b15ef1001bd559aa2f92cdfc7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_chaum, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 10:10:56 compute-1 conmon[232598]: conmon 340ee6cd798e2b37f658 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-340ee6cd798e2b37f6584634f872fdceb3b8e8b15ef1001bd559aa2f92cdfc7c.scope/container/memory.events
Jan 26 10:10:56 compute-1 systemd[1]: var-lib-containers-storage-overlay-a496413aa491a69d954bb122faa92dee8a50cf1ec283f3a879edccad5e1eb4bd-merged.mount: Deactivated successfully.
Jan 26 10:10:56 compute-1 podman[232582]: 2026-01-26 10:10:56.733070782 +0000 UTC m=+0.298663766 container remove 340ee6cd798e2b37f6584634f872fdceb3b8e8b15ef1001bd559aa2f92cdfc7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_chaum, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 10:10:56 compute-1 systemd[1]: libpod-conmon-340ee6cd798e2b37f6584634f872fdceb3b8e8b15ef1001bd559aa2f92cdfc7c.scope: Deactivated successfully.
Jan 26 10:10:56 compute-1 podman[232622]: 2026-01-26 10:10:56.93717642 +0000 UTC m=+0.063103480 container create c332c6308b59c020b68c7df7148137ea27f67c5d6be2aa9e4f16155cefe52c66 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_lederberg, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 10:10:56 compute-1 systemd[1]: Started libpod-conmon-c332c6308b59c020b68c7df7148137ea27f67c5d6be2aa9e4f16155cefe52c66.scope.
Jan 26 10:10:57 compute-1 podman[232622]: 2026-01-26 10:10:56.909444691 +0000 UTC m=+0.035371851 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 10:10:57 compute-1 systemd[1]: Started libcrun container.
Jan 26 10:10:57 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a50efcbbac7411183286ce8649f8495c8917c72edcff13946694a5932b1ab55e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 10:10:57 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a50efcbbac7411183286ce8649f8495c8917c72edcff13946694a5932b1ab55e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 10:10:57 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a50efcbbac7411183286ce8649f8495c8917c72edcff13946694a5932b1ab55e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 10:10:57 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a50efcbbac7411183286ce8649f8495c8917c72edcff13946694a5932b1ab55e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 10:10:57 compute-1 podman[232622]: 2026-01-26 10:10:57.033195386 +0000 UTC m=+0.159122486 container init c332c6308b59c020b68c7df7148137ea27f67c5d6be2aa9e4f16155cefe52c66 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_lederberg, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, OSD_FLAVOR=default)
Jan 26 10:10:57 compute-1 podman[232622]: 2026-01-26 10:10:57.039970624 +0000 UTC m=+0.165897694 container start c332c6308b59c020b68c7df7148137ea27f67c5d6be2aa9e4f16155cefe52c66 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_lederberg, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Jan 26 10:10:57 compute-1 podman[232622]: 2026-01-26 10:10:57.043340993 +0000 UTC m=+0.169268083 container attach c332c6308b59c020b68c7df7148137ea27f67c5d6be2aa9e4f16155cefe52c66 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_lederberg, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid)
Jan 26 10:10:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:57.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:57.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:57 compute-1 sshd-session[232404]: Connection reset by authenticating user root 176.120.22.13 port 29598 [preauth]
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]: [
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:     {
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:         "available": false,
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:         "being_replaced": false,
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:         "ceph_device_lvm": false,
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:         "device_id": "QEMU_DVD-ROM_QM00001",
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:         "lsm_data": {},
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:         "lvs": [],
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:         "path": "/dev/sr0",
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:         "rejected_reasons": [
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "Has a FileSystem",
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "Insufficient space (<5GB)"
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:         ],
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:         "sys_api": {
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "actuators": null,
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "device_nodes": [
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:                 "sr0"
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             ],
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "devname": "sr0",
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "human_readable_size": "482.00 KB",
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "id_bus": "ata",
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "model": "QEMU DVD-ROM",
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "nr_requests": "2",
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "parent": "/dev/sr0",
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "partitions": {},
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "path": "/dev/sr0",
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "removable": "1",
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "rev": "2.5+",
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "ro": "0",
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "rotational": "1",
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "sas_address": "",
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "sas_device_handle": "",
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "scheduler_mode": "mq-deadline",
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "sectors": 0,
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "sectorsize": "2048",
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "size": 493568.0,
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "support_discard": "2048",
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "type": "disk",
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:             "vendor": "QEMU"
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:         }
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]:     }
Jan 26 10:10:57 compute-1 goofy_lederberg[232638]: ]
Jan 26 10:10:57 compute-1 systemd[1]: libpod-c332c6308b59c020b68c7df7148137ea27f67c5d6be2aa9e4f16155cefe52c66.scope: Deactivated successfully.
Jan 26 10:10:57 compute-1 podman[232622]: 2026-01-26 10:10:57.934841921 +0000 UTC m=+1.060768981 container died c332c6308b59c020b68c7df7148137ea27f67c5d6be2aa9e4f16155cefe52c66 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_lederberg, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 10:10:58 compute-1 systemd[1]: var-lib-containers-storage-overlay-a50efcbbac7411183286ce8649f8495c8917c72edcff13946694a5932b1ab55e-merged.mount: Deactivated successfully.
Jan 26 10:10:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:10:58 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:10:58 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:10:58 compute-1 ceph-mon[80107]: pgmap v862: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 407 KiB/s rd, 2.1 MiB/s wr, 94 op/s
Jan 26 10:10:58 compute-1 podman[232622]: 2026-01-26 10:10:58.641963001 +0000 UTC m=+1.767890061 container remove c332c6308b59c020b68c7df7148137ea27f67c5d6be2aa9e4f16155cefe52c66 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_lederberg, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Jan 26 10:10:58 compute-1 systemd[1]: libpod-conmon-c332c6308b59c020b68c7df7148137ea27f67c5d6be2aa9e4f16155cefe52c66.scope: Deactivated successfully.
Jan 26 10:10:58 compute-1 sudo[232518]: pam_unix(sudo:session): session closed for user root
Jan 26 10:10:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:10:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:59 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:10:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:59 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:10:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:59 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:10:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:59.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:10:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:10:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:59.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.397 226326 DEBUG oslo_concurrency.lockutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.398 226326 DEBUG oslo_concurrency.lockutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.398 226326 DEBUG oslo_concurrency.lockutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.399 226326 DEBUG oslo_concurrency.lockutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.399 226326 DEBUG oslo_concurrency.lockutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.400 226326 INFO nova.compute.manager [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Terminating instance
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.402 226326 DEBUG nova.compute.manager [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 10:10:59 compute-1 kernel: tapa3edb0bf-fe (unregistering): left promiscuous mode
Jan 26 10:10:59 compute-1 NetworkManager[49073]: <info>  [1769422259.4645] device (tapa3edb0bf-fe): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 10:10:59 compute-1 ovn_controller[133670]: 2026-01-26T10:10:59Z|00044|binding|INFO|Releasing lport a3edb0bf-fece-4486-bde7-200b47a3207f from this chassis (sb_readonly=0)
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.475 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:59 compute-1 ovn_controller[133670]: 2026-01-26T10:10:59Z|00045|binding|INFO|Setting lport a3edb0bf-fece-4486-bde7-200b47a3207f down in Southbound
Jan 26 10:10:59 compute-1 ovn_controller[133670]: 2026-01-26T10:10:59Z|00046|binding|INFO|Removing iface tapa3edb0bf-fe ovn-installed in OVS
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.478 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:59 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.485 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:93:05:f9 10.100.0.6'], port_security=['fa:16:3e:93:05:f9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'eafb89b3-bd76-4bd7-9ba0-c169e9e02d17', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-586582bb-e1bc-4aa4-9785-243b8df2dbcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9fcc6dd6-8be1-499a-bb83-b80d1cff1d47', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=83c6eda3-fd53-4ca0-bad4-44f3e56aac7f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], logical_port=a3edb0bf-fece-4486-bde7-200b47a3207f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:10:59 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.487 143326 INFO neutron.agent.ovn.metadata.agent [-] Port a3edb0bf-fece-4486-bde7-200b47a3207f in datapath 586582bb-e1bc-4aa4-9785-243b8df2dbcb unbound from our chassis
Jan 26 10:10:59 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.489 143326 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 586582bb-e1bc-4aa4-9785-243b8df2dbcb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 10:10:59 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.490 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[fbe0008a-4af2-4ad5-8e15-1ad746308f21]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:10:59 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.491 143326 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb namespace which is not needed anymore
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.506 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:59 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Deactivated successfully.
Jan 26 10:10:59 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Consumed 15.356s CPU time.
Jan 26 10:10:59 compute-1 systemd-machined[194876]: Machine qemu-2-instance-00000004 terminated.
Jan 26 10:10:59 compute-1 sshd-session[232942]: Connection reset by authenticating user root 176.120.22.13 port 29624 [preauth]
Jan 26 10:10:59 compute-1 neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb[232069]: [NOTICE]   (232073) : haproxy version is 2.8.14-c23fe91
Jan 26 10:10:59 compute-1 neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb[232069]: [NOTICE]   (232073) : path to executable is /usr/sbin/haproxy
Jan 26 10:10:59 compute-1 neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb[232069]: [WARNING]  (232073) : Exiting Master process...
Jan 26 10:10:59 compute-1 neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb[232069]: [WARNING]  (232073) : Exiting Master process...
Jan 26 10:10:59 compute-1 neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb[232069]: [ALERT]    (232073) : Current worker (232075) exited with code 143 (Terminated)
Jan 26 10:10:59 compute-1 neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb[232069]: [WARNING]  (232073) : All workers exited. Exiting... (0)
Jan 26 10:10:59 compute-1 systemd[1]: libpod-6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705.scope: Deactivated successfully.
Jan 26 10:10:59 compute-1 podman[234033]: 2026-01-26 10:10:59.646961116 +0000 UTC m=+0.049708079 container died 6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.646 226326 INFO nova.virt.libvirt.driver [-] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Instance destroyed successfully.
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.648 226326 DEBUG nova.objects.instance [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'resources' on Instance uuid eafb89b3-bd76-4bd7-9ba0-c169e9e02d17 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.668 226326 DEBUG nova.virt.libvirt.vif [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T10:09:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1372750228',display_name='tempest-TestNetworkBasicOps-server-1372750228',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1372750228',id=4,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMsvtjjj+6XI+gg1E4Xzo+HtbXxyF3peabH1GoeKNDwXTbi9v3JnK6KlG1vj7QQfwaNJGTtKqqAkDphs9wj+ke0O+NbixfA149oWC+Na1rLDGTUpAAOvTtxkuWhA6lVC4w==',key_name='tempest-TestNetworkBasicOps-796997991',keypairs=<?>,launch_index=0,launched_at=2026-01-26T10:09:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-318o5yar',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T10:09:57Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=eafb89b3-bd76-4bd7-9ba0-c169e9e02d17,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.669 226326 DEBUG nova.network.os_vif_util [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.670 226326 DEBUG nova.network.os_vif_util [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:93:05:f9,bridge_name='br-int',has_traffic_filtering=True,id=a3edb0bf-fece-4486-bde7-200b47a3207f,network=Network(586582bb-e1bc-4aa4-9785-243b8df2dbcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3edb0bf-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.670 226326 DEBUG os_vif [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:93:05:f9,bridge_name='br-int',has_traffic_filtering=True,id=a3edb0bf-fece-4486-bde7-200b47a3207f,network=Network(586582bb-e1bc-4aa4-9785-243b8df2dbcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3edb0bf-fe') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.673 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.673 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3edb0bf-fe, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.676 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.677 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:59 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705-userdata-shm.mount: Deactivated successfully.
Jan 26 10:10:59 compute-1 systemd[1]: var-lib-containers-storage-overlay-a3d3f80d88a6c2d2bff340e38e70bb3d56800b97343639f380b5d8e3058e1a4a-merged.mount: Deactivated successfully.
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.682 226326 INFO os_vif [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:93:05:f9,bridge_name='br-int',has_traffic_filtering=True,id=a3edb0bf-fece-4486-bde7-200b47a3207f,network=Network(586582bb-e1bc-4aa4-9785-243b8df2dbcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3edb0bf-fe')
Jan 26 10:10:59 compute-1 podman[234033]: 2026-01-26 10:10:59.693939681 +0000 UTC m=+0.096686644 container cleanup 6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 10:10:59 compute-1 systemd[1]: libpod-conmon-6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705.scope: Deactivated successfully.
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.736 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:59 compute-1 podman[234089]: 2026-01-26 10:10:59.754810212 +0000 UTC m=+0.041700298 container remove 6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 10:10:59 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.760 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[043cc56d-10f0-40bd-b4f0-33709d550f90]: (4, ('Mon Jan 26 10:10:59 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb (6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705)\n6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705\nMon Jan 26 10:10:59 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb (6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705)\n6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:10:59 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.762 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[e4f60603-906f-4fd0-ad78-c49e3e03e73e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:10:59 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.763 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap586582bb-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.764 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:59 compute-1 kernel: tap586582bb-e0: left promiscuous mode
Jan 26 10:10:59 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:10:59 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:10:59 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:10:59 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:10:59 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:10:59 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:10:59 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:10:59 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:10:59 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.778 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.779 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:10:59 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.782 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[c9a7bfdc-8da5-4f03-9396-891b3119744a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:10:59 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.797 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[f8653c71-2074-4386-a1f4-4ef029cb1c6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:10:59 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.798 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[2522c824-0720-4e69-a078-49f421f6b4e1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:10:59 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.812 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[629acbac-baec-4532-a555-4dfdc983b124]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415715, 'reachable_time': 37047, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234108, 'error': None, 'target': 'ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:10:59 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.815 143615 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 10:10:59 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.816 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[244f096f-8ad6-44aa-9784-4adb4fc9aa54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:10:59 compute-1 systemd[1]: run-netns-ovnmeta\x2d586582bb\x2de1bc\x2d4aa4\x2d9785\x2d243b8df2dbcb.mount: Deactivated successfully.
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.922 226326 DEBUG nova.compute.manager [req-f7b97531-55f7-4884-9752-aba67056ded4 req-6f1a8d97-d97d-4c64-96b7-8cc98b2ee7ef b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received event network-vif-unplugged-a3edb0bf-fece-4486-bde7-200b47a3207f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.922 226326 DEBUG oslo_concurrency.lockutils [req-f7b97531-55f7-4884-9752-aba67056ded4 req-6f1a8d97-d97d-4c64-96b7-8cc98b2ee7ef b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.923 226326 DEBUG oslo_concurrency.lockutils [req-f7b97531-55f7-4884-9752-aba67056ded4 req-6f1a8d97-d97d-4c64-96b7-8cc98b2ee7ef b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.923 226326 DEBUG oslo_concurrency.lockutils [req-f7b97531-55f7-4884-9752-aba67056ded4 req-6f1a8d97-d97d-4c64-96b7-8cc98b2ee7ef b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.923 226326 DEBUG nova.compute.manager [req-f7b97531-55f7-4884-9752-aba67056ded4 req-6f1a8d97-d97d-4c64-96b7-8cc98b2ee7ef b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] No waiting events found dispatching network-vif-unplugged-a3edb0bf-fece-4486-bde7-200b47a3207f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 10:10:59 compute-1 nova_compute[226322]: 2026-01-26 10:10:59.924 226326 DEBUG nova.compute.manager [req-f7b97531-55f7-4884-9752-aba67056ded4 req-6f1a8d97-d97d-4c64-96b7-8cc98b2ee7ef b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received event network-vif-unplugged-a3edb0bf-fece-4486-bde7-200b47a3207f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 10:11:00 compute-1 nova_compute[226322]: 2026-01-26 10:11:00.130 226326 INFO nova.virt.libvirt.driver [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Deleting instance files /var/lib/nova/instances/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_del
Jan 26 10:11:00 compute-1 nova_compute[226322]: 2026-01-26 10:11:00.130 226326 INFO nova.virt.libvirt.driver [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Deletion of /var/lib/nova/instances/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_del complete
Jan 26 10:11:00 compute-1 nova_compute[226322]: 2026-01-26 10:11:00.181 226326 INFO nova.compute.manager [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Took 0.78 seconds to destroy the instance on the hypervisor.
Jan 26 10:11:00 compute-1 nova_compute[226322]: 2026-01-26 10:11:00.181 226326 DEBUG oslo.service.loopingcall [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 10:11:00 compute-1 nova_compute[226322]: 2026-01-26 10:11:00.182 226326 DEBUG nova.compute.manager [-] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 10:11:00 compute-1 nova_compute[226322]: 2026-01-26 10:11:00.182 226326 DEBUG nova.network.neutron [-] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 10:11:00 compute-1 ceph-mon[80107]: pgmap v863: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 407 KiB/s rd, 2.1 MiB/s wr, 94 op/s
Jan 26 10:11:01 compute-1 nova_compute[226322]: 2026-01-26 10:11:01.111 226326 DEBUG nova.network.neutron [-] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:11:01 compute-1 nova_compute[226322]: 2026-01-26 10:11:01.129 226326 INFO nova.compute.manager [-] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Took 0.95 seconds to deallocate network for instance.
Jan 26 10:11:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:01.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:01.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:01 compute-1 nova_compute[226322]: 2026-01-26 10:11:01.332 226326 DEBUG oslo_concurrency.lockutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:11:01 compute-1 nova_compute[226322]: 2026-01-26 10:11:01.332 226326 DEBUG oslo_concurrency.lockutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:11:01 compute-1 nova_compute[226322]: 2026-01-26 10:11:01.607 226326 DEBUG oslo_concurrency.processutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:11:02 compute-1 nova_compute[226322]: 2026-01-26 10:11:02.067 226326 DEBUG nova.compute.manager [req-31c6a930-95af-4209-862d-8a5fc4c9c5f6 req-12ef8123-4b45-4d45-b567-2286c64009cb b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received event network-vif-plugged-a3edb0bf-fece-4486-bde7-200b47a3207f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:11:02 compute-1 nova_compute[226322]: 2026-01-26 10:11:02.068 226326 DEBUG oslo_concurrency.lockutils [req-31c6a930-95af-4209-862d-8a5fc4c9c5f6 req-12ef8123-4b45-4d45-b567-2286c64009cb b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:11:02 compute-1 nova_compute[226322]: 2026-01-26 10:11:02.068 226326 DEBUG oslo_concurrency.lockutils [req-31c6a930-95af-4209-862d-8a5fc4c9c5f6 req-12ef8123-4b45-4d45-b567-2286c64009cb b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:11:02 compute-1 nova_compute[226322]: 2026-01-26 10:11:02.069 226326 DEBUG oslo_concurrency.lockutils [req-31c6a930-95af-4209-862d-8a5fc4c9c5f6 req-12ef8123-4b45-4d45-b567-2286c64009cb b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:11:02 compute-1 nova_compute[226322]: 2026-01-26 10:11:02.069 226326 DEBUG nova.compute.manager [req-31c6a930-95af-4209-862d-8a5fc4c9c5f6 req-12ef8123-4b45-4d45-b567-2286c64009cb b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] No waiting events found dispatching network-vif-plugged-a3edb0bf-fece-4486-bde7-200b47a3207f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 10:11:02 compute-1 nova_compute[226322]: 2026-01-26 10:11:02.070 226326 WARNING nova.compute.manager [req-31c6a930-95af-4209-862d-8a5fc4c9c5f6 req-12ef8123-4b45-4d45-b567-2286c64009cb b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received unexpected event network-vif-plugged-a3edb0bf-fece-4486-bde7-200b47a3207f for instance with vm_state deleted and task_state None.
Jan 26 10:11:02 compute-1 nova_compute[226322]: 2026-01-26 10:11:02.070 226326 DEBUG nova.compute.manager [req-31c6a930-95af-4209-862d-8a5fc4c9c5f6 req-12ef8123-4b45-4d45-b567-2286c64009cb b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received event network-vif-deleted-a3edb0bf-fece-4486-bde7-200b47a3207f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:11:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:11:02 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3970104987' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:11:02 compute-1 nova_compute[226322]: 2026-01-26 10:11:02.120 226326 DEBUG oslo_concurrency.processutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:11:02 compute-1 nova_compute[226322]: 2026-01-26 10:11:02.128 226326 DEBUG nova.compute.provider_tree [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:11:02 compute-1 nova_compute[226322]: 2026-01-26 10:11:02.143 226326 DEBUG nova.scheduler.client.report [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:11:02 compute-1 nova_compute[226322]: 2026-01-26 10:11:02.174 226326 DEBUG oslo_concurrency.lockutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.842s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:11:02 compute-1 nova_compute[226322]: 2026-01-26 10:11:02.219 226326 INFO nova.scheduler.client.report [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Deleted allocations for instance eafb89b3-bd76-4bd7-9ba0-c169e9e02d17
Jan 26 10:11:02 compute-1 nova_compute[226322]: 2026-01-26 10:11:02.286 226326 DEBUG oslo_concurrency.lockutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.888s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:11:02 compute-1 sshd-session[234105]: Invalid user onlime_r from 176.120.22.13 port 29638
Jan 26 10:11:02 compute-1 ceph-mon[80107]: pgmap v864: 353 pgs: 353 active+clean; 41 MiB data, 312 MiB used, 60 GiB / 60 GiB avail; 419 KiB/s rd, 2.1 MiB/s wr, 111 op/s
Jan 26 10:11:02 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3970104987' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:11:03 compute-1 sshd-session[234105]: Connection reset by invalid user onlime_r 176.120.22.13 port 29638 [preauth]
Jan 26 10:11:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:03.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:03.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:03 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:11:03 compute-1 sudo[234136]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:11:03 compute-1 sudo[234136]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:11:03 compute-1 sudo[234136]: pam_unix(sudo:session): session closed for user root
Jan 26 10:11:03 compute-1 sudo[234163]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:11:03 compute-1 sudo[234163]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:11:03 compute-1 sudo[234163]: pam_unix(sudo:session): session closed for user root
Jan 26 10:11:03 compute-1 podman[234161]: 2026-01-26 10:11:03.754067325 +0000 UTC m=+0.101636699 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 10:11:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:11:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:11:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:11:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:04 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:11:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:11:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:11:04 compute-1 ceph-mon[80107]: pgmap v865: 353 pgs: 353 active+clean; 41 MiB data, 312 MiB used, 60 GiB / 60 GiB avail; 32 KiB/s rd, 14 KiB/s wr, 46 op/s
Jan 26 10:11:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:11:04 compute-1 nova_compute[226322]: 2026-01-26 10:11:04.676 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:04 compute-1 nova_compute[226322]: 2026-01-26 10:11:04.737 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:04 compute-1 sshd-session[234134]: Invalid user admin from 176.120.22.13 port 64194
Jan 26 10:11:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:05.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:05.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:05 compute-1 sshd-session[234134]: Connection reset by invalid user admin 176.120.22.13 port 64194 [preauth]
Jan 26 10:11:05 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 10:11:05 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Cumulative writes: 5443 writes, 28K keys, 5443 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.04 MB/s
                                           Cumulative WAL: 5443 writes, 5443 syncs, 1.00 writes per sync, written: 0.07 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1537 writes, 7358 keys, 1537 commit groups, 1.0 writes per commit group, ingest: 17.15 MB, 0.03 MB/s
                                           Interval WAL: 1537 writes, 1537 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     83.4      0.53              0.14        15    0.035       0      0       0.0       0.0
                                             L6      1/0   13.88 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   4.1    150.2    128.7      1.39              0.42        14    0.099     73K   7423       0.0       0.0
                                            Sum      1/0   13.88 MB   0.0      0.2     0.0      0.2       0.2      0.1       0.0   5.1    109.0    116.3      1.92              0.55        29    0.066     73K   7423       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.7     94.2     96.0      0.79              0.20        10    0.079     29K   2577       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0    150.2    128.7      1.39              0.42        14    0.099     73K   7423       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     83.8      0.52              0.14        14    0.037       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.043, interval 0.011
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.22 GB write, 0.12 MB/s write, 0.20 GB read, 0.12 MB/s read, 1.9 seconds
                                           Interval compaction: 0.07 GB write, 0.13 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.8 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55af2cb209b0#2 capacity: 304.00 MB usage: 17.12 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000131 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(916,16.53 MB,5.43732%) FilterBlock(29,217.11 KB,0.0697437%) IndexBlock(29,387.08 KB,0.124344%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Jan 26 10:11:06 compute-1 ceph-mon[80107]: pgmap v866: 353 pgs: 353 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 39 KiB/s rd, 15 KiB/s wr, 57 op/s
Jan 26 10:11:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:07.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:07.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:07 compute-1 sshd-session[234214]: Invalid user admin from 176.120.22.13 port 64236
Jan 26 10:11:07 compute-1 sshd-session[234214]: Connection reset by invalid user admin 176.120.22.13 port 64236 [preauth]
Jan 26 10:11:08 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:11:08 compute-1 ceph-mon[80107]: pgmap v867: 353 pgs: 353 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Jan 26 10:11:08 compute-1 nova_compute[226322]: 2026-01-26 10:11:08.895 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:08 compute-1 nova_compute[226322]: 2026-01-26 10:11:08.981 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:11:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:09 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:11:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:09 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:11:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:09 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:11:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:09.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:09.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:09 compute-1 nova_compute[226322]: 2026-01-26 10:11:09.679 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:09 compute-1 nova_compute[226322]: 2026-01-26 10:11:09.739 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:10 compute-1 ceph-mon[80107]: pgmap v868: 353 pgs: 353 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Jan 26 10:11:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:11.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:11.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:12 compute-1 ceph-mon[80107]: pgmap v869: 353 pgs: 353 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Jan 26 10:11:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:13.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:13.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:13 compute-1 sshd-session[234222]: Invalid user admin from 152.42.136.110 port 42102
Jan 26 10:11:13 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:11:13 compute-1 sshd-session[234222]: Connection closed by invalid user admin 152.42.136.110 port 42102 [preauth]
Jan 26 10:11:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:11:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:11:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:11:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:14 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:11:14 compute-1 nova_compute[226322]: 2026-01-26 10:11:14.644 226326 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769422259.6430793, eafb89b3-bd76-4bd7-9ba0-c169e9e02d17 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 10:11:14 compute-1 nova_compute[226322]: 2026-01-26 10:11:14.645 226326 INFO nova.compute.manager [-] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] VM Stopped (Lifecycle Event)
Jan 26 10:11:14 compute-1 nova_compute[226322]: 2026-01-26 10:11:14.661 226326 DEBUG nova.compute.manager [None req-60fa603f-98a4-42e2-9ad8-ee1dff2adb4c - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:11:14 compute-1 nova_compute[226322]: 2026-01-26 10:11:14.682 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:14 compute-1 ceph-mon[80107]: pgmap v870: 353 pgs: 353 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 7.9 KiB/s rd, 597 B/s wr, 10 op/s
Jan 26 10:11:14 compute-1 nova_compute[226322]: 2026-01-26 10:11:14.741 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:15.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:15.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:16 compute-1 sshd-session[234225]: Invalid user admin from 178.62.249.31 port 45966
Jan 26 10:11:16 compute-1 ceph-mon[80107]: pgmap v871: 353 pgs: 353 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 8.3 KiB/s rd, 597 B/s wr, 11 op/s
Jan 26 10:11:16 compute-1 sshd-session[234225]: Connection closed by invalid user admin 178.62.249.31 port 45966 [preauth]
Jan 26 10:11:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:17.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:17.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:18 compute-1 podman[234229]: 2026-01-26 10:11:18.297004382 +0000 UTC m=+0.062639612 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 10:11:18 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:11:18 compute-1 ceph-mon[80107]: pgmap v872: 353 pgs: 353 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 0 op/s
Jan 26 10:11:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:11:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:11:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:11:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:11:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:11:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:19.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:19.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:19 compute-1 nova_compute[226322]: 2026-01-26 10:11:19.685 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:19 compute-1 nova_compute[226322]: 2026-01-26 10:11:19.744 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:20 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:11:20.055 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:11:20 compute-1 nova_compute[226322]: 2026-01-26 10:11:20.056 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:20 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:11:20.056 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 10:11:20 compute-1 ceph-mon[80107]: pgmap v873: 353 pgs: 353 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 0 op/s
Jan 26 10:11:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:21.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:21.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:22 compute-1 ceph-mon[80107]: pgmap v874: 353 pgs: 353 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 0 op/s
Jan 26 10:11:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:23.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:23.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:23 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:11:23 compute-1 sudo[234251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:11:23 compute-1 sudo[234251]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:11:23 compute-1 sudo[234251]: pam_unix(sudo:session): session closed for user root
Jan 26 10:11:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:11:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:11:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:11:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:11:24 compute-1 nova_compute[226322]: 2026-01-26 10:11:24.688 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:24 compute-1 nova_compute[226322]: 2026-01-26 10:11:24.748 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:24 compute-1 ceph-mon[80107]: pgmap v875: 353 pgs: 353 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 0 op/s
Jan 26 10:11:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:25.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:25.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:25 compute-1 nova_compute[226322]: 2026-01-26 10:11:25.682 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:11:26 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2018372751' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:11:27 compute-1 ceph-mon[80107]: pgmap v876: 353 pgs: 353 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Jan 26 10:11:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:27.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:27.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:28 compute-1 ceph-mon[80107]: pgmap v877: 353 pgs: 353 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 0 op/s
Jan 26 10:11:28 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:11:28 compute-1 nova_compute[226322]: 2026-01-26 10:11:28.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:11:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:11:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:11:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:11:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:11:29 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:11:29.058 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:11:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:29.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:29.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:29 compute-1 nova_compute[226322]: 2026-01-26 10:11:29.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:11:29 compute-1 nova_compute[226322]: 2026-01-26 10:11:29.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 26 10:11:29 compute-1 nova_compute[226322]: 2026-01-26 10:11:29.691 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:29 compute-1 nova_compute[226322]: 2026-01-26 10:11:29.703 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 26 10:11:29 compute-1 nova_compute[226322]: 2026-01-26 10:11:29.703 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:11:29 compute-1 nova_compute[226322]: 2026-01-26 10:11:29.704 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 26 10:11:29 compute-1 nova_compute[226322]: 2026-01-26 10:11:29.749 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:30 compute-1 ceph-mon[80107]: pgmap v878: 353 pgs: 353 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 0 op/s
Jan 26 10:11:30 compute-1 nova_compute[226322]: 2026-01-26 10:11:30.726 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:11:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:11:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:31.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:11:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:31.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:31 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1411173654' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:11:32 compute-1 nova_compute[226322]: 2026-01-26 10:11:32.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:11:32 compute-1 nova_compute[226322]: 2026-01-26 10:11:32.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:11:32 compute-1 nova_compute[226322]: 2026-01-26 10:11:32.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:11:32 compute-1 nova_compute[226322]: 2026-01-26 10:11:32.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:11:32 compute-1 nova_compute[226322]: 2026-01-26 10:11:32.701 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:11:32 compute-1 nova_compute[226322]: 2026-01-26 10:11:32.702 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:11:32 compute-1 nova_compute[226322]: 2026-01-26 10:11:32.702 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:11:32 compute-1 nova_compute[226322]: 2026-01-26 10:11:32.702 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:11:32 compute-1 nova_compute[226322]: 2026-01-26 10:11:32.702 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:11:32 compute-1 ceph-mon[80107]: pgmap v879: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 26 10:11:32 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3698987314' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:11:32 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1364752818' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:11:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:33.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:33.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:33 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:11:33 compute-1 nova_compute[226322]: 2026-01-26 10:11:33.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:11:33 compute-1 nova_compute[226322]: 2026-01-26 10:11:33.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:11:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:11:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:11:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:11:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:34 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:11:34 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2361096299' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:11:34 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1050430947' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:11:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:11:34 compute-1 podman[234281]: 2026-01-26 10:11:34.296170875 +0000 UTC m=+0.075617098 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 10:11:34 compute-1 nova_compute[226322]: 2026-01-26 10:11:34.691 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:34 compute-1 nova_compute[226322]: 2026-01-26 10:11:34.751 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:35 compute-1 ceph-mon[80107]: pgmap v880: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 26 10:11:35 compute-1 nova_compute[226322]: 2026-01-26 10:11:35.154 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:11:35 compute-1 nova_compute[226322]: 2026-01-26 10:11:35.190 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:11:35 compute-1 nova_compute[226322]: 2026-01-26 10:11:35.190 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:11:35 compute-1 nova_compute[226322]: 2026-01-26 10:11:35.190 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:11:35 compute-1 nova_compute[226322]: 2026-01-26 10:11:35.191 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:11:35 compute-1 nova_compute[226322]: 2026-01-26 10:11:35.191 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:11:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:35.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:35.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:35 compute-1 sshd-session[234309]: Invalid user centos from 104.248.198.71 port 44332
Jan 26 10:11:35 compute-1 sshd-session[234309]: Connection closed by invalid user centos 104.248.198.71 port 44332 [preauth]
Jan 26 10:11:35 compute-1 nova_compute[226322]: 2026-01-26 10:11:35.663 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:11:35 compute-1 nova_compute[226322]: 2026-01-26 10:11:35.816 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:11:35 compute-1 nova_compute[226322]: 2026-01-26 10:11:35.817 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4913MB free_disk=59.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:11:35 compute-1 nova_compute[226322]: 2026-01-26 10:11:35.818 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:11:35 compute-1 nova_compute[226322]: 2026-01-26 10:11:35.818 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:11:35 compute-1 nova_compute[226322]: 2026-01-26 10:11:35.944 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:11:35 compute-1 nova_compute[226322]: 2026-01-26 10:11:35.944 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:11:36 compute-1 nova_compute[226322]: 2026-01-26 10:11:36.004 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:11:36 compute-1 ceph-mon[80107]: pgmap v881: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 38 op/s
Jan 26 10:11:36 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2607926891' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:11:36 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1296608185' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:11:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:11:36 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3663071606' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:11:36 compute-1 nova_compute[226322]: 2026-01-26 10:11:36.657 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.653s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:11:36 compute-1 nova_compute[226322]: 2026-01-26 10:11:36.662 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:11:36 compute-1 nova_compute[226322]: 2026-01-26 10:11:36.686 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:11:36 compute-1 nova_compute[226322]: 2026-01-26 10:11:36.707 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:11:36 compute-1 nova_compute[226322]: 2026-01-26 10:11:36.707 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.889s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:11:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:37.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:37.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:37 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3663071606' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:11:38 compute-1 ceph-mon[80107]: pgmap v882: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Jan 26 10:11:38 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:11:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:11:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:11:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:11:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:39 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:11:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:39.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:39.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:39 compute-1 nova_compute[226322]: 2026-01-26 10:11:39.693 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:39 compute-1 nova_compute[226322]: 2026-01-26 10:11:39.752 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:40 compute-1 ceph-mon[80107]: pgmap v883: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Jan 26 10:11:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:41.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:41.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:42 compute-1 ceph-mon[80107]: pgmap v884: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Jan 26 10:11:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:43.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:43.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:43 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:11:43 compute-1 sudo[234361]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:11:43 compute-1 sudo[234361]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:11:43 compute-1 sudo[234361]: pam_unix(sudo:session): session closed for user root
Jan 26 10:11:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:11:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:11:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:11:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:44 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:11:44 compute-1 ceph-mon[80107]: pgmap v885: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 26 10:11:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:45.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:45.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:46 compute-1 nova_compute[226322]: 2026-01-26 10:11:44.695 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:46 compute-1 nova_compute[226322]: 2026-01-26 10:11:44.753 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:46 compute-1 ceph-mon[80107]: pgmap v886: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 26 10:11:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:47.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:47.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:47 compute-1 ovn_controller[133670]: 2026-01-26T10:11:47Z|00047|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 26 10:11:48 compute-1 ceph-mon[80107]: pgmap v887: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 64 op/s
Jan 26 10:11:48 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:11:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:11:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:11:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:11:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:49 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:11:49 compute-1 nova_compute[226322]: 2026-01-26 10:11:49.199 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:11:49 compute-1 podman[234389]: 2026-01-26 10:11:49.265512925 +0000 UTC m=+0.044424925 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Jan 26 10:11:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:49.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:49.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:49 compute-1 nova_compute[226322]: 2026-01-26 10:11:49.697 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:49 compute-1 nova_compute[226322]: 2026-01-26 10:11:49.755 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:11:51 compute-1 ceph-mon[80107]: pgmap v888: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 64 op/s
Jan 26 10:11:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:51.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:51.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:52 compute-1 ceph-mon[80107]: pgmap v889: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 124 op/s
Jan 26 10:11:52 compute-1 sshd-session[234412]: Invalid user admin from 152.42.136.110 port 44244
Jan 26 10:11:52 compute-1 sshd-session[234412]: Connection closed by invalid user admin 152.42.136.110 port 44244 [preauth]
Jan 26 10:11:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:53.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:53.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:53 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:11:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:11:53.936 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:11:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:11:53.937 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:11:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:11:53.937 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:11:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:11:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:11:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:11:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:54 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:11:54 compute-1 nova_compute[226322]: 2026-01-26 10:11:54.699 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:54 compute-1 nova_compute[226322]: 2026-01-26 10:11:54.756 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:55.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:55.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:55 compute-1 ceph-mon[80107]: pgmap v890: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 251 KiB/s rd, 2.1 MiB/s wr, 60 op/s
Jan 26 10:11:56 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 26 10:11:57 compute-1 ceph-mon[80107]: pgmap v891: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 251 KiB/s rd, 2.1 MiB/s wr, 60 op/s
Jan 26 10:11:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:57.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:11:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:57.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:11:58 compute-1 ceph-mon[80107]: pgmap v892: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 251 KiB/s rd, 2.1 MiB/s wr, 60 op/s
Jan 26 10:11:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:11:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:11:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:11:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:11:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:59 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:11:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:59.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:11:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:11:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:59.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:11:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/3514844431' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:11:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/3514844431' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:11:59 compute-1 nova_compute[226322]: 2026-01-26 10:11:59.701 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:11:59 compute-1 nova_compute[226322]: 2026-01-26 10:11:59.757 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:00 compute-1 ceph-mon[80107]: pgmap v893: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 251 KiB/s rd, 2.1 MiB/s wr, 60 op/s
Jan 26 10:12:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:01.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:01.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:01 compute-1 sshd-session[234419]: Invalid user admin from 178.62.249.31 port 37762
Jan 26 10:12:02 compute-1 sshd-session[234419]: Connection closed by invalid user admin 178.62.249.31 port 37762 [preauth]
Jan 26 10:12:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:03.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:03.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:03 compute-1 ceph-mon[80107]: pgmap v894: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 251 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Jan 26 10:12:03 compute-1 sudo[234423]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:12:03 compute-1 sudo[234423]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:12:03 compute-1 sudo[234423]: pam_unix(sudo:session): session closed for user root
Jan 26 10:12:03 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:12:03 compute-1 sudo[234448]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:12:03 compute-1 sudo[234448]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:12:03 compute-1 sudo[234448]: pam_unix(sudo:session): session closed for user root
Jan 26 10:12:04 compute-1 sudo[234473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:12:04 compute-1 sudo[234473]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:12:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:04 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:12:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:04 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:12:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:04 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:12:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:04 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:12:04 compute-1 ceph-mon[80107]: pgmap v895: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 12 KiB/s wr, 1 op/s
Jan 26 10:12:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:12:04 compute-1 sudo[234473]: pam_unix(sudo:session): session closed for user root
Jan 26 10:12:04 compute-1 nova_compute[226322]: 2026-01-26 10:12:04.703 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:04 compute-1 nova_compute[226322]: 2026-01-26 10:12:04.762 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:12:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:05.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:12:05 compute-1 podman[234529]: 2026-01-26 10:12:05.354211985 +0000 UTC m=+0.127184297 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 26 10:12:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:12:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:05.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:12:07 compute-1 ceph-mon[80107]: pgmap v896: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 14 KiB/s wr, 1 op/s
Jan 26 10:12:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:12:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:07.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:12:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:07.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:08 compute-1 ceph-mon[80107]: pgmap v897: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 14 KiB/s wr, 1 op/s
Jan 26 10:12:08 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:12:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:12:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:12:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:12:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:09 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:12:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:12:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:09.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:12:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:09.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:09 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:12:09 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:12:09 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:12:09 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:12:09 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:12:09 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:12:09 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:12:09 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:12:09 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:12:09 compute-1 nova_compute[226322]: 2026-01-26 10:12:09.731 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:09 compute-1 nova_compute[226322]: 2026-01-26 10:12:09.765 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:10 compute-1 ceph-mon[80107]: pgmap v898: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 14 KiB/s wr, 1 op/s
Jan 26 10:12:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:11.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:12:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:11.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:12:12 compute-1 ceph-mon[80107]: pgmap v899: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 15 KiB/s wr, 1 op/s
Jan 26 10:12:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:13.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:12:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:13.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:12:13 compute-1 sudo[234558]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:12:13 compute-1 sudo[234558]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:12:13 compute-1 sudo[234558]: pam_unix(sudo:session): session closed for user root
Jan 26 10:12:13 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:12:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:12:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:12:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:12:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:14 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:12:14 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:12:14 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:12:14 compute-1 ceph-mon[80107]: pgmap v900: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 3.0 KiB/s wr, 0 op/s
Jan 26 10:12:14 compute-1 nova_compute[226322]: 2026-01-26 10:12:14.733 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:14 compute-1 nova_compute[226322]: 2026-01-26 10:12:14.766 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:15.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:12:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:15.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:12:17 compute-1 ceph-mon[80107]: pgmap v901: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 3.0 KiB/s wr, 1 op/s
Jan 26 10:12:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:17.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:12:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:17.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:12:18 compute-1 ceph-mon[80107]: pgmap v902: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 1023 B/s wr, 0 op/s
Jan 26 10:12:18 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:12:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:12:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:12:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:12:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:12:19 compute-1 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 26 10:12:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:19.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:12:19 compute-1 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 26 10:12:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:19.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:19 compute-1 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 26 10:12:19 compute-1 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Jan 26 10:12:19 compute-1 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 26 10:12:19 compute-1 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 26 10:12:19 compute-1 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Jan 26 10:12:19 compute-1 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Jan 26 10:12:19 compute-1 nova_compute[226322]: 2026-01-26 10:12:19.736 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:19 compute-1 nova_compute[226322]: 2026-01-26 10:12:19.769 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:20 compute-1 podman[234587]: 2026-01-26 10:12:20.273889273 +0000 UTC m=+0.053848507 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 10:12:20 compute-1 ceph-mon[80107]: pgmap v903: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 1023 B/s wr, 0 op/s
Jan 26 10:12:21 compute-1 sshd-session[234607]: Invalid user centos from 104.248.198.71 port 42832
Jan 26 10:12:21 compute-1 sshd-session[234607]: Connection closed by invalid user centos 104.248.198.71 port 42832 [preauth]
Jan 26 10:12:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:12:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:21.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:12:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:21.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:21 compute-1 nova_compute[226322]: 2026-01-26 10:12:21.616 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:21 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:21.616 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:12:21 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:21.617 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 10:12:22 compute-1 ceph-mon[80107]: pgmap v904: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 7.0 KiB/s wr, 46 op/s
Jan 26 10:12:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:12:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:23.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:12:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:23.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:23 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:12:23 compute-1 sudo[234611]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:12:23 compute-1 sudo[234611]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:12:23 compute-1 sudo[234611]: pam_unix(sudo:session): session closed for user root
Jan 26 10:12:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:12:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:12:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:12:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:12:24 compute-1 nova_compute[226322]: 2026-01-26 10:12:24.738 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:24 compute-1 nova_compute[226322]: 2026-01-26 10:12:24.771 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:24 compute-1 ceph-mon[80107]: pgmap v905: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 6.0 KiB/s wr, 46 op/s
Jan 26 10:12:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:25.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:25.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:26 compute-1 nova_compute[226322]: 2026-01-26 10:12:26.472 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "17d96116-83a4-40d0-9dcd-bd5072238621" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:12:26 compute-1 nova_compute[226322]: 2026-01-26 10:12:26.473 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:12:26 compute-1 nova_compute[226322]: 2026-01-26 10:12:26.488 226326 DEBUG nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 10:12:26 compute-1 nova_compute[226322]: 2026-01-26 10:12:26.561 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:12:26 compute-1 nova_compute[226322]: 2026-01-26 10:12:26.562 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:12:26 compute-1 nova_compute[226322]: 2026-01-26 10:12:26.569 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 10:12:26 compute-1 nova_compute[226322]: 2026-01-26 10:12:26.570 226326 INFO nova.compute.claims [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Claim successful on node compute-1.ctlplane.example.com
Jan 26 10:12:26 compute-1 nova_compute[226322]: 2026-01-26 10:12:26.721 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:12:26 compute-1 ceph-mon[80107]: pgmap v906: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 61 KiB/s rd, 6.0 KiB/s wr, 99 op/s
Jan 26 10:12:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:12:27 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/981306923' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.267 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.545s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.274 226326 DEBUG nova.compute.provider_tree [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.288 226326 DEBUG nova.scheduler.client.report [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.314 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.752s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.315 226326 DEBUG nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 10:12:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:12:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:27.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.368 226326 DEBUG nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.368 226326 DEBUG nova.network.neutron [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.390 226326 INFO nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.417 226326 DEBUG nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 10:12:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:27.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.554 226326 DEBUG nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.555 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.555 226326 INFO nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Creating image(s)
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.579 226326 DEBUG nova.storage.rbd_utils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 17d96116-83a4-40d0-9dcd-bd5072238621_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.608 226326 DEBUG nova.storage.rbd_utils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 17d96116-83a4-40d0-9dcd-bd5072238621_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.641 226326 DEBUG nova.storage.rbd_utils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 17d96116-83a4-40d0-9dcd-bd5072238621_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.645 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.704 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.705 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "d81880e926e175d0cc7241caa7cc18231a8a289c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.706 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "d81880e926e175d0cc7241caa7cc18231a8a289c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.706 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "d81880e926e175d0cc7241caa7cc18231a8a289c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.733 226326 DEBUG nova.storage.rbd_utils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 17d96116-83a4-40d0-9dcd-bd5072238621_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:12:27 compute-1 nova_compute[226322]: 2026-01-26 10:12:27.739 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c 17d96116-83a4-40d0-9dcd-bd5072238621_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:12:27 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/981306923' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:12:28 compute-1 nova_compute[226322]: 2026-01-26 10:12:28.045 226326 DEBUG nova.policy [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c1208d3e25b940ea93fe76884c7a53db', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 10:12:28 compute-1 nova_compute[226322]: 2026-01-26 10:12:28.111 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c 17d96116-83a4-40d0-9dcd-bd5072238621_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.372s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:12:28 compute-1 nova_compute[226322]: 2026-01-26 10:12:28.173 226326 DEBUG nova.storage.rbd_utils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] resizing rbd image 17d96116-83a4-40d0-9dcd-bd5072238621_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 26 10:12:28 compute-1 nova_compute[226322]: 2026-01-26 10:12:28.359 226326 DEBUG nova.objects.instance [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'migration_context' on Instance uuid 17d96116-83a4-40d0-9dcd-bd5072238621 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 10:12:28 compute-1 nova_compute[226322]: 2026-01-26 10:12:28.388 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 10:12:28 compute-1 nova_compute[226322]: 2026-01-26 10:12:28.388 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Ensure instance console log exists: /var/lib/nova/instances/17d96116-83a4-40d0-9dcd-bd5072238621/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 10:12:28 compute-1 nova_compute[226322]: 2026-01-26 10:12:28.389 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:12:28 compute-1 nova_compute[226322]: 2026-01-26 10:12:28.389 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:12:28 compute-1 nova_compute[226322]: 2026-01-26 10:12:28.389 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:12:28 compute-1 nova_compute[226322]: 2026-01-26 10:12:28.802 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:12:28 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:12:28 compute-1 ceph-mon[80107]: pgmap v907: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 60 KiB/s rd, 6.0 KiB/s wr, 99 op/s
Jan 26 10:12:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:12:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:12:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:12:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:12:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:29.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:29.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:29 compute-1 nova_compute[226322]: 2026-01-26 10:12:29.740 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:29 compute-1 nova_compute[226322]: 2026-01-26 10:12:29.773 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:30 compute-1 nova_compute[226322]: 2026-01-26 10:12:30.145 226326 DEBUG nova.network.neutron [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Successfully created port: 93f5d287-34ec-424d-8776-df002083762e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 10:12:30 compute-1 ceph-mon[80107]: pgmap v908: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 60 KiB/s rd, 6.0 KiB/s wr, 99 op/s
Jan 26 10:12:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:31.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:12:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:31.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:12:31 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:31.619 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:12:31 compute-1 nova_compute[226322]: 2026-01-26 10:12:31.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:12:31 compute-1 sshd-session[234827]: Invalid user admin from 152.42.136.110 port 46356
Jan 26 10:12:32 compute-1 nova_compute[226322]: 2026-01-26 10:12:32.234 226326 DEBUG nova.network.neutron [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Successfully updated port: 93f5d287-34ec-424d-8776-df002083762e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 10:12:32 compute-1 nova_compute[226322]: 2026-01-26 10:12:32.250 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "refresh_cache-17d96116-83a4-40d0-9dcd-bd5072238621" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:12:32 compute-1 nova_compute[226322]: 2026-01-26 10:12:32.251 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquired lock "refresh_cache-17d96116-83a4-40d0-9dcd-bd5072238621" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:12:32 compute-1 nova_compute[226322]: 2026-01-26 10:12:32.251 226326 DEBUG nova.network.neutron [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 10:12:32 compute-1 sshd-session[234827]: Connection closed by invalid user admin 152.42.136.110 port 46356 [preauth]
Jan 26 10:12:32 compute-1 nova_compute[226322]: 2026-01-26 10:12:32.438 226326 DEBUG nova.network.neutron [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 10:12:32 compute-1 nova_compute[226322]: 2026-01-26 10:12:32.459 226326 DEBUG nova.compute.manager [req-c600f5a7-12f3-4785-b9bb-0f9ceac097d2 req-d2ea289b-91f1-4253-a9f7-4aa3ee7a8eed b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Received event network-changed-93f5d287-34ec-424d-8776-df002083762e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:12:32 compute-1 nova_compute[226322]: 2026-01-26 10:12:32.460 226326 DEBUG nova.compute.manager [req-c600f5a7-12f3-4785-b9bb-0f9ceac097d2 req-d2ea289b-91f1-4253-a9f7-4aa3ee7a8eed b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Refreshing instance network info cache due to event network-changed-93f5d287-34ec-424d-8776-df002083762e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 10:12:32 compute-1 nova_compute[226322]: 2026-01-26 10:12:32.460 226326 DEBUG oslo_concurrency.lockutils [req-c600f5a7-12f3-4785-b9bb-0f9ceac097d2 req-d2ea289b-91f1-4253-a9f7-4aa3ee7a8eed b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-17d96116-83a4-40d0-9dcd-bd5072238621" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:12:32 compute-1 nova_compute[226322]: 2026-01-26 10:12:32.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:12:32 compute-1 nova_compute[226322]: 2026-01-26 10:12:32.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:12:32 compute-1 ceph-mon[80107]: pgmap v909: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 78 KiB/s rd, 1.8 MiB/s wr, 126 op/s
Jan 26 10:12:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:33.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:33.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.595 226326 DEBUG nova.network.neutron [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Updating instance_info_cache with network_info: [{"id": "93f5d287-34ec-424d-8776-df002083762e", "address": "fa:16:3e:00:c7:32", "network": {"id": "ae1cb66c-0987-4156-9bdb-cb2a08957306", "bridge": "br-int", "label": "tempest-network-smoke--514366077", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93f5d287-34", "ovs_interfaceid": "93f5d287-34ec-424d-8776-df002083762e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.637 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Releasing lock "refresh_cache-17d96116-83a4-40d0-9dcd-bd5072238621" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.638 226326 DEBUG nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Instance network_info: |[{"id": "93f5d287-34ec-424d-8776-df002083762e", "address": "fa:16:3e:00:c7:32", "network": {"id": "ae1cb66c-0987-4156-9bdb-cb2a08957306", "bridge": "br-int", "label": "tempest-network-smoke--514366077", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93f5d287-34", "ovs_interfaceid": "93f5d287-34ec-424d-8776-df002083762e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.638 226326 DEBUG oslo_concurrency.lockutils [req-c600f5a7-12f3-4785-b9bb-0f9ceac097d2 req-d2ea289b-91f1-4253-a9f7-4aa3ee7a8eed b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-17d96116-83a4-40d0-9dcd-bd5072238621" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.639 226326 DEBUG nova.network.neutron [req-c600f5a7-12f3-4785-b9bb-0f9ceac097d2 req-d2ea289b-91f1-4253-a9f7-4aa3ee7a8eed b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Refreshing network info cache for port 93f5d287-34ec-424d-8776-df002083762e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.641 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Start _get_guest_xml network_info=[{"id": "93f5d287-34ec-424d-8776-df002083762e", "address": "fa:16:3e:00:c7:32", "network": {"id": "ae1cb66c-0987-4156-9bdb-cb2a08957306", "bridge": "br-int", "label": "tempest-network-smoke--514366077", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93f5d287-34", "ovs_interfaceid": "93f5d287-34ec-424d-8776-df002083762e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T10:05:39Z,direct_url=<?>,disk_format='qcow2',id=6789692f-fc1f-4efa-ae75-dcc13be695ef,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3ff3fa2a5531460b993c609589aa545d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T10:05:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'size': 0, 'image_id': '6789692f-fc1f-4efa-ae75-dcc13be695ef'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.646 226326 WARNING nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.661 226326 DEBUG nova.virt.libvirt.host [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.662 226326 DEBUG nova.virt.libvirt.host [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.667 226326 DEBUG nova.virt.libvirt.host [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.668 226326 DEBUG nova.virt.libvirt.host [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.668 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.669 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T10:05:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='57e1601b-dbfa-4d3b-8b96-27302e4a7a06',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T10:05:39Z,direct_url=<?>,disk_format='qcow2',id=6789692f-fc1f-4efa-ae75-dcc13be695ef,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3ff3fa2a5531460b993c609589aa545d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T10:05:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.669 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.669 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.670 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.670 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.670 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.670 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.671 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.671 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.671 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.671 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.674 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.692 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.693 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.693 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.693 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.843 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.844 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.845 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:12:33 compute-1 nova_compute[226322]: 2026-01-26 10:12:33.846 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:12:33 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1103791736' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:12:33 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2082041347' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:12:33 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:12:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:12:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:12:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:12:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:34 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:12:34 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 26 10:12:34 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2307260624' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.211 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.537s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.244 226326 DEBUG nova.storage.rbd_utils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 17d96116-83a4-40d0-9dcd-bd5072238621_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.249 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:12:34 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 26 10:12:34 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3132034729' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.686 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.688 226326 DEBUG nova.virt.libvirt.vif [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T10:12:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-200115615',display_name='tempest-TestNetworkBasicOps-server-200115615',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-200115615',id=7,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNdcGSrtCXDvk89I0Vz9B7It1mSwlDvkZdMdWLCvoVbAe00VgK1axvS1LkIm+2Wq3uLZqcGzySpB1p+E5CAX8FNwfd40H16DcgIFt/DJC5r2xLViVsjIsqcjjLML0XecLw==',key_name='tempest-TestNetworkBasicOps-1792120849',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-0x8cj090',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T10:12:27Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=17d96116-83a4-40d0-9dcd-bd5072238621,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "93f5d287-34ec-424d-8776-df002083762e", "address": "fa:16:3e:00:c7:32", "network": {"id": "ae1cb66c-0987-4156-9bdb-cb2a08957306", "bridge": "br-int", "label": "tempest-network-smoke--514366077", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93f5d287-34", "ovs_interfaceid": "93f5d287-34ec-424d-8776-df002083762e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.688 226326 DEBUG nova.network.os_vif_util [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "93f5d287-34ec-424d-8776-df002083762e", "address": "fa:16:3e:00:c7:32", "network": {"id": "ae1cb66c-0987-4156-9bdb-cb2a08957306", "bridge": "br-int", "label": "tempest-network-smoke--514366077", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93f5d287-34", "ovs_interfaceid": "93f5d287-34ec-424d-8776-df002083762e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.689 226326 DEBUG nova.network.os_vif_util [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:c7:32,bridge_name='br-int',has_traffic_filtering=True,id=93f5d287-34ec-424d-8776-df002083762e,network=Network(ae1cb66c-0987-4156-9bdb-cb2a08957306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93f5d287-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.691 226326 DEBUG nova.objects.instance [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 17d96116-83a4-40d0-9dcd-bd5072238621 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.726 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] End _get_guest_xml xml=<domain type="kvm">
Jan 26 10:12:34 compute-1 nova_compute[226322]:   <uuid>17d96116-83a4-40d0-9dcd-bd5072238621</uuid>
Jan 26 10:12:34 compute-1 nova_compute[226322]:   <name>instance-00000007</name>
Jan 26 10:12:34 compute-1 nova_compute[226322]:   <memory>131072</memory>
Jan 26 10:12:34 compute-1 nova_compute[226322]:   <vcpu>1</vcpu>
Jan 26 10:12:34 compute-1 nova_compute[226322]:   <metadata>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <nova:name>tempest-TestNetworkBasicOps-server-200115615</nova:name>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <nova:creationTime>2026-01-26 10:12:33</nova:creationTime>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <nova:flavor name="m1.nano">
Jan 26 10:12:34 compute-1 nova_compute[226322]:         <nova:memory>128</nova:memory>
Jan 26 10:12:34 compute-1 nova_compute[226322]:         <nova:disk>1</nova:disk>
Jan 26 10:12:34 compute-1 nova_compute[226322]:         <nova:swap>0</nova:swap>
Jan 26 10:12:34 compute-1 nova_compute[226322]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 10:12:34 compute-1 nova_compute[226322]:         <nova:vcpus>1</nova:vcpus>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       </nova:flavor>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <nova:owner>
Jan 26 10:12:34 compute-1 nova_compute[226322]:         <nova:user uuid="c1208d3e25b940ea93fe76884c7a53db">tempest-TestNetworkBasicOps-966559857-project-member</nova:user>
Jan 26 10:12:34 compute-1 nova_compute[226322]:         <nova:project uuid="6ed221b375a44fc2bb2a8f232c5446e7">tempest-TestNetworkBasicOps-966559857</nova:project>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       </nova:owner>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <nova:root type="image" uuid="6789692f-fc1f-4efa-ae75-dcc13be695ef"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <nova:ports>
Jan 26 10:12:34 compute-1 nova_compute[226322]:         <nova:port uuid="93f5d287-34ec-424d-8776-df002083762e">
Jan 26 10:12:34 compute-1 nova_compute[226322]:           <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:         </nova:port>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       </nova:ports>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     </nova:instance>
Jan 26 10:12:34 compute-1 nova_compute[226322]:   </metadata>
Jan 26 10:12:34 compute-1 nova_compute[226322]:   <sysinfo type="smbios">
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <system>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <entry name="manufacturer">RDO</entry>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <entry name="product">OpenStack Compute</entry>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <entry name="serial">17d96116-83a4-40d0-9dcd-bd5072238621</entry>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <entry name="uuid">17d96116-83a4-40d0-9dcd-bd5072238621</entry>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <entry name="family">Virtual Machine</entry>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     </system>
Jan 26 10:12:34 compute-1 nova_compute[226322]:   </sysinfo>
Jan 26 10:12:34 compute-1 nova_compute[226322]:   <os>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <boot dev="hd"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <smbios mode="sysinfo"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:   </os>
Jan 26 10:12:34 compute-1 nova_compute[226322]:   <features>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <acpi/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <apic/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <vmcoreinfo/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:   </features>
Jan 26 10:12:34 compute-1 nova_compute[226322]:   <clock offset="utc">
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <timer name="hpet" present="no"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:   </clock>
Jan 26 10:12:34 compute-1 nova_compute[226322]:   <cpu mode="host-model" match="exact">
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:   </cpu>
Jan 26 10:12:34 compute-1 nova_compute[226322]:   <devices>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <disk type="network" device="disk">
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <driver type="raw" cache="none"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <source protocol="rbd" name="vms/17d96116-83a4-40d0-9dcd-bd5072238621_disk">
Jan 26 10:12:34 compute-1 nova_compute[226322]:         <host name="192.168.122.100" port="6789"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:         <host name="192.168.122.102" port="6789"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:         <host name="192.168.122.101" port="6789"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       </source>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <auth username="openstack">
Jan 26 10:12:34 compute-1 nova_compute[226322]:         <secret type="ceph" uuid="1a70b85d-e3fd-5814-8a6a-37ea00fcae30"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       </auth>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <target dev="vda" bus="virtio"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     </disk>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <disk type="network" device="cdrom">
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <driver type="raw" cache="none"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <source protocol="rbd" name="vms/17d96116-83a4-40d0-9dcd-bd5072238621_disk.config">
Jan 26 10:12:34 compute-1 nova_compute[226322]:         <host name="192.168.122.100" port="6789"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:         <host name="192.168.122.102" port="6789"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:         <host name="192.168.122.101" port="6789"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       </source>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <auth username="openstack">
Jan 26 10:12:34 compute-1 nova_compute[226322]:         <secret type="ceph" uuid="1a70b85d-e3fd-5814-8a6a-37ea00fcae30"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       </auth>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <target dev="sda" bus="sata"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     </disk>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <interface type="ethernet">
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <mac address="fa:16:3e:00:c7:32"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <model type="virtio"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <mtu size="1442"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <target dev="tap93f5d287-34"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     </interface>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <serial type="pty">
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <log file="/var/lib/nova/instances/17d96116-83a4-40d0-9dcd-bd5072238621/console.log" append="off"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     </serial>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <video>
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <model type="virtio"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     </video>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <input type="tablet" bus="usb"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <rng model="virtio">
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <backend model="random">/dev/urandom</backend>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     </rng>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <controller type="usb" index="0"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     <memballoon model="virtio">
Jan 26 10:12:34 compute-1 nova_compute[226322]:       <stats period="10"/>
Jan 26 10:12:34 compute-1 nova_compute[226322]:     </memballoon>
Jan 26 10:12:34 compute-1 nova_compute[226322]:   </devices>
Jan 26 10:12:34 compute-1 nova_compute[226322]: </domain>
Jan 26 10:12:34 compute-1 nova_compute[226322]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.727 226326 DEBUG nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Preparing to wait for external event network-vif-plugged-93f5d287-34ec-424d-8776-df002083762e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.727 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.728 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.728 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.728 226326 DEBUG nova.virt.libvirt.vif [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T10:12:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-200115615',display_name='tempest-TestNetworkBasicOps-server-200115615',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-200115615',id=7,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNdcGSrtCXDvk89I0Vz9B7It1mSwlDvkZdMdWLCvoVbAe00VgK1axvS1LkIm+2Wq3uLZqcGzySpB1p+E5CAX8FNwfd40H16DcgIFt/DJC5r2xLViVsjIsqcjjLML0XecLw==',key_name='tempest-TestNetworkBasicOps-1792120849',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-0x8cj090',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T10:12:27Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=17d96116-83a4-40d0-9dcd-bd5072238621,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "93f5d287-34ec-424d-8776-df002083762e", "address": "fa:16:3e:00:c7:32", "network": {"id": "ae1cb66c-0987-4156-9bdb-cb2a08957306", "bridge": "br-int", "label": "tempest-network-smoke--514366077", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93f5d287-34", "ovs_interfaceid": "93f5d287-34ec-424d-8776-df002083762e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.729 226326 DEBUG nova.network.os_vif_util [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "93f5d287-34ec-424d-8776-df002083762e", "address": "fa:16:3e:00:c7:32", "network": {"id": "ae1cb66c-0987-4156-9bdb-cb2a08957306", "bridge": "br-int", "label": "tempest-network-smoke--514366077", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93f5d287-34", "ovs_interfaceid": "93f5d287-34ec-424d-8776-df002083762e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.729 226326 DEBUG nova.network.os_vif_util [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:c7:32,bridge_name='br-int',has_traffic_filtering=True,id=93f5d287-34ec-424d-8776-df002083762e,network=Network(ae1cb66c-0987-4156-9bdb-cb2a08957306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93f5d287-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.730 226326 DEBUG os_vif [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:c7:32,bridge_name='br-int',has_traffic_filtering=True,id=93f5d287-34ec-424d-8776-df002083762e,network=Network(ae1cb66c-0987-4156-9bdb-cb2a08957306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93f5d287-34') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.730 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.731 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.731 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.735 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.735 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap93f5d287-34, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.735 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap93f5d287-34, col_values=(('external_ids', {'iface-id': '93f5d287-34ec-424d-8776-df002083762e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:00:c7:32', 'vm-uuid': '17d96116-83a4-40d0-9dcd-bd5072238621'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.737 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:34 compute-1 NetworkManager[49073]: <info>  [1769422354.7393] manager: (tap93f5d287-34): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.739 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.745 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.747 226326 INFO os_vif [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:c7:32,bridge_name='br-int',has_traffic_filtering=True,id=93f5d287-34ec-424d-8776-df002083762e,network=Network(ae1cb66c-0987-4156-9bdb-cb2a08957306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93f5d287-34')
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.774 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.823 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.823 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.824 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No VIF found with MAC fa:16:3e:00:c7:32, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.824 226326 INFO nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Using config drive
Jan 26 10:12:34 compute-1 nova_compute[226322]: 2026-01-26 10:12:34.863 226326 DEBUG nova.storage.rbd_utils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 17d96116-83a4-40d0-9dcd-bd5072238621_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:12:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:35.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:12:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:35.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:12:35 compute-1 nova_compute[226322]: 2026-01-26 10:12:35.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:12:35 compute-1 nova_compute[226322]: 2026-01-26 10:12:35.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:12:35 compute-1 nova_compute[226322]: 2026-01-26 10:12:35.963 226326 DEBUG nova.network.neutron [req-c600f5a7-12f3-4785-b9bb-0f9ceac097d2 req-d2ea289b-91f1-4253-a9f7-4aa3ee7a8eed b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Updated VIF entry in instance network info cache for port 93f5d287-34ec-424d-8776-df002083762e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 10:12:35 compute-1 nova_compute[226322]: 2026-01-26 10:12:35.963 226326 DEBUG nova.network.neutron [req-c600f5a7-12f3-4785-b9bb-0f9ceac097d2 req-d2ea289b-91f1-4253-a9f7-4aa3ee7a8eed b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Updating instance_info_cache with network_info: [{"id": "93f5d287-34ec-424d-8776-df002083762e", "address": "fa:16:3e:00:c7:32", "network": {"id": "ae1cb66c-0987-4156-9bdb-cb2a08957306", "bridge": "br-int", "label": "tempest-network-smoke--514366077", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93f5d287-34", "ovs_interfaceid": "93f5d287-34ec-424d-8776-df002083762e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:12:36 compute-1 ceph-mon[80107]: pgmap v910: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 49 KiB/s rd, 1.8 MiB/s wr, 80 op/s
Jan 26 10:12:36 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:12:36 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2307260624' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:12:36 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2126813450' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:12:36 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/435654853' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:12:36 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3132034729' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:12:36 compute-1 podman[234914]: 2026-01-26 10:12:36.332143102 +0000 UTC m=+0.101809412 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:12:36 compute-1 nova_compute[226322]: 2026-01-26 10:12:36.343 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:12:36 compute-1 nova_compute[226322]: 2026-01-26 10:12:36.343 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:12:36 compute-1 nova_compute[226322]: 2026-01-26 10:12:36.344 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:12:36 compute-1 nova_compute[226322]: 2026-01-26 10:12:36.344 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:12:36 compute-1 nova_compute[226322]: 2026-01-26 10:12:36.345 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:12:36 compute-1 nova_compute[226322]: 2026-01-26 10:12:36.373 226326 DEBUG oslo_concurrency.lockutils [req-c600f5a7-12f3-4785-b9bb-0f9ceac097d2 req-d2ea289b-91f1-4253-a9f7-4aa3ee7a8eed b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-17d96116-83a4-40d0-9dcd-bd5072238621" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:12:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:12:36 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3576426832' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:12:36 compute-1 nova_compute[226322]: 2026-01-26 10:12:36.812 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:12:36 compute-1 nova_compute[226322]: 2026-01-26 10:12:36.868 226326 DEBUG nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] skipping disk for instance-00000007 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 26 10:12:36 compute-1 nova_compute[226322]: 2026-01-26 10:12:36.868 226326 DEBUG nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] skipping disk for instance-00000007 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 26 10:12:36 compute-1 ceph-mon[80107]: pgmap v911: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 50 KiB/s rd, 1.8 MiB/s wr, 82 op/s
Jan 26 10:12:36 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3576426832' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:12:37 compute-1 nova_compute[226322]: 2026-01-26 10:12:37.013 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:12:37 compute-1 nova_compute[226322]: 2026-01-26 10:12:37.014 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4826MB free_disk=59.921871185302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:12:37 compute-1 nova_compute[226322]: 2026-01-26 10:12:37.014 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:12:37 compute-1 nova_compute[226322]: 2026-01-26 10:12:37.014 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:12:37 compute-1 nova_compute[226322]: 2026-01-26 10:12:37.108 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Instance 17d96116-83a4-40d0-9dcd-bd5072238621 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 10:12:37 compute-1 nova_compute[226322]: 2026-01-26 10:12:37.109 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:12:37 compute-1 nova_compute[226322]: 2026-01-26 10:12:37.109 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:12:37 compute-1 nova_compute[226322]: 2026-01-26 10:12:37.127 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing inventories for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 26 10:12:37 compute-1 nova_compute[226322]: 2026-01-26 10:12:37.149 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating ProviderTree inventory for provider d06842a0-5d13-4573-bb78-d433bbb380e4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 26 10:12:37 compute-1 nova_compute[226322]: 2026-01-26 10:12:37.150 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating inventory in ProviderTree for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 10:12:37 compute-1 nova_compute[226322]: 2026-01-26 10:12:37.166 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing aggregate associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 26 10:12:37 compute-1 nova_compute[226322]: 2026-01-26 10:12:37.184 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing trait associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, traits: HW_CPU_X86_CLMUL,HW_CPU_X86_SSE,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_FMA3,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 26 10:12:37 compute-1 nova_compute[226322]: 2026-01-26 10:12:37.216 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:12:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:12:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:37.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:12:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:12:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:37.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:12:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:12:37 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1035487032' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:12:37 compute-1 nova_compute[226322]: 2026-01-26 10:12:37.644 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:12:37 compute-1 nova_compute[226322]: 2026-01-26 10:12:37.650 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:12:37 compute-1 nova_compute[226322]: 2026-01-26 10:12:37.676 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:12:37 compute-1 nova_compute[226322]: 2026-01-26 10:12:37.810 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:12:37 compute-1 nova_compute[226322]: 2026-01-26 10:12:37.811 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.796s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:12:38 compute-1 nova_compute[226322]: 2026-01-26 10:12:38.039 226326 INFO nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Creating config drive at /var/lib/nova/instances/17d96116-83a4-40d0-9dcd-bd5072238621/disk.config
Jan 26 10:12:38 compute-1 nova_compute[226322]: 2026-01-26 10:12:38.045 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/17d96116-83a4-40d0-9dcd-bd5072238621/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4941ruca execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:12:38 compute-1 nova_compute[226322]: 2026-01-26 10:12:38.173 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/17d96116-83a4-40d0-9dcd-bd5072238621/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4941ruca" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:12:38 compute-1 nova_compute[226322]: 2026-01-26 10:12:38.204 226326 DEBUG nova.storage.rbd_utils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 17d96116-83a4-40d0-9dcd-bd5072238621_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:12:38 compute-1 nova_compute[226322]: 2026-01-26 10:12:38.211 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/17d96116-83a4-40d0-9dcd-bd5072238621/disk.config 17d96116-83a4-40d0-9dcd-bd5072238621_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:12:38 compute-1 nova_compute[226322]: 2026-01-26 10:12:38.461 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/17d96116-83a4-40d0-9dcd-bd5072238621/disk.config 17d96116-83a4-40d0-9dcd-bd5072238621_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.250s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:12:38 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1035487032' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:12:38 compute-1 nova_compute[226322]: 2026-01-26 10:12:38.462 226326 INFO nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Deleting local config drive /var/lib/nova/instances/17d96116-83a4-40d0-9dcd-bd5072238621/disk.config because it was imported into RBD.
Jan 26 10:12:38 compute-1 systemd[1]: Starting libvirt secret daemon...
Jan 26 10:12:38 compute-1 systemd[1]: Started libvirt secret daemon.
Jan 26 10:12:38 compute-1 kernel: tap93f5d287-34: entered promiscuous mode
Jan 26 10:12:38 compute-1 NetworkManager[49073]: <info>  [1769422358.5543] manager: (tap93f5d287-34): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Jan 26 10:12:38 compute-1 ovn_controller[133670]: 2026-01-26T10:12:38Z|00048|binding|INFO|Claiming lport 93f5d287-34ec-424d-8776-df002083762e for this chassis.
Jan 26 10:12:38 compute-1 ovn_controller[133670]: 2026-01-26T10:12:38Z|00049|binding|INFO|93f5d287-34ec-424d-8776-df002083762e: Claiming fa:16:3e:00:c7:32 10.100.0.27
Jan 26 10:12:38 compute-1 nova_compute[226322]: 2026-01-26 10:12:38.554 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.566 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:c7:32 10.100.0.27'], port_security=['fa:16:3e:00:c7:32 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '17d96116-83a4-40d0-9dcd-bd5072238621', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae1cb66c-0987-4156-9bdb-cb2a08957306', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b040f826-b5f4-4627-aa36-62cc16e0727f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=73bcc0f9-41ce-47a1-86a1-53fe1b73bb31, chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], logical_port=93f5d287-34ec-424d-8776-df002083762e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.567 143326 INFO neutron.agent.ovn.metadata.agent [-] Port 93f5d287-34ec-424d-8776-df002083762e in datapath ae1cb66c-0987-4156-9bdb-cb2a08957306 bound to our chassis
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.569 143326 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ae1cb66c-0987-4156-9bdb-cb2a08957306
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.579 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[9761b12b-3f43-49ca-8e22-96f5049eda59]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.579 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapae1cb66c-01 in ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.581 229912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapae1cb66c-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.581 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[94d46c7d-9afb-4ce8-a066-d387fe0d1d10]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.583 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[126a6126-52d3-444d-893e-6176b0ec86b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:12:38 compute-1 systemd-udevd[235059]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.594 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[d0ca7631-4f9f-48cf-b9c3-0b42ac58ed78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:12:38 compute-1 systemd-machined[194876]: New machine qemu-3-instance-00000007.
Jan 26 10:12:38 compute-1 nova_compute[226322]: 2026-01-26 10:12:38.598 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:38 compute-1 systemd[1]: Started Virtual Machine qemu-3-instance-00000007.
Jan 26 10:12:38 compute-1 NetworkManager[49073]: <info>  [1769422358.6048] device (tap93f5d287-34): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 10:12:38 compute-1 NetworkManager[49073]: <info>  [1769422358.6059] device (tap93f5d287-34): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 10:12:38 compute-1 nova_compute[226322]: 2026-01-26 10:12:38.605 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:38 compute-1 ovn_controller[133670]: 2026-01-26T10:12:38Z|00050|binding|INFO|Setting lport 93f5d287-34ec-424d-8776-df002083762e ovn-installed in OVS
Jan 26 10:12:38 compute-1 ovn_controller[133670]: 2026-01-26T10:12:38Z|00051|binding|INFO|Setting lport 93f5d287-34ec-424d-8776-df002083762e up in Southbound
Jan 26 10:12:38 compute-1 nova_compute[226322]: 2026-01-26 10:12:38.607 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.618 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[d68a26a7-afe2-4dde-b27e-1efeb027b076]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.646 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[4a661095-829e-4f7a-a19a-8738785eee4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.652 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[0f12ea25-d8e3-48b7-8e16-456ca9b59e8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:12:38 compute-1 NetworkManager[49073]: <info>  [1769422358.6541] manager: (tapae1cb66c-00): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.678 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[48bf061d-52a7-4446-83e2-c1df9248b330]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.681 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[d51a5050-3321-4852-b87c-82754a6e424e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:12:38 compute-1 NetworkManager[49073]: <info>  [1769422358.6992] device (tapae1cb66c-00): carrier: link connected
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.703 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[8e845334-24ee-415f-b949-9f1084d8fde5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.720 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[633f0268-fcda-4fea-a44f-0dc1586e74ec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapae1cb66c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:97:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431926, 'reachable_time': 27698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235090, 'error': None, 'target': 'ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.732 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[e192661e-311f-4911-a6a9-1f12bd428398]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:9770'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431926, 'tstamp': 431926}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235091, 'error': None, 'target': 'ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.745 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[7c9fc857-dfcb-4c45-b1ec-49b792daa337]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapae1cb66c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:97:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431926, 'reachable_time': 27698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235092, 'error': None, 'target': 'ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.770 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[ece01b40-e012-4877-a327-ae8e5503bb06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.831 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[71459863-83c7-4544-bff5-dd6e0904fa0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.832 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae1cb66c-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.833 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.833 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapae1cb66c-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:12:38 compute-1 nova_compute[226322]: 2026-01-26 10:12:38.835 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:38 compute-1 NetworkManager[49073]: <info>  [1769422358.8364] manager: (tapae1cb66c-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Jan 26 10:12:38 compute-1 kernel: tapae1cb66c-00: entered promiscuous mode
Jan 26 10:12:38 compute-1 nova_compute[226322]: 2026-01-26 10:12:38.839 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.840 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapae1cb66c-00, col_values=(('external_ids', {'iface-id': 'eff5217a-1c96-40b0-bc7b-e1d3937349a6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:12:38 compute-1 ovn_controller[133670]: 2026-01-26T10:12:38Z|00052|binding|INFO|Releasing lport eff5217a-1c96-40b0-bc7b-e1d3937349a6 from this chassis (sb_readonly=0)
Jan 26 10:12:38 compute-1 nova_compute[226322]: 2026-01-26 10:12:38.842 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:38 compute-1 nova_compute[226322]: 2026-01-26 10:12:38.868 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.869 143326 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ae1cb66c-0987-4156-9bdb-cb2a08957306.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ae1cb66c-0987-4156-9bdb-cb2a08957306.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.870 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[8d7f018f-0675-46de-9496-350b0537ec72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.871 143326 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: global
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     log         /dev/log local0 debug
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     log-tag     haproxy-metadata-proxy-ae1cb66c-0987-4156-9bdb-cb2a08957306
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     user        root
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     group       root
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     maxconn     1024
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     pidfile     /var/lib/neutron/external/pids/ae1cb66c-0987-4156-9bdb-cb2a08957306.pid.haproxy
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     daemon
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: defaults
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     log global
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     mode http
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     option httplog
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     option dontlognull
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     option http-server-close
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     option forwardfor
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     retries                 3
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     timeout http-request    30s
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     timeout connect         30s
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     timeout client          32s
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     timeout server          32s
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     timeout http-keep-alive 30s
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: listen listener
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     bind 169.254.169.254:80
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:     http-request add-header X-OVN-Network-ID ae1cb66c-0987-4156-9bdb-cb2a08957306
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 10:12:38 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.871 143326 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306', 'env', 'PROCESS_TAG=haproxy-ae1cb66c-0987-4156-9bdb-cb2a08957306', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ae1cb66c-0987-4156-9bdb-cb2a08957306.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 10:12:38 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:12:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:12:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:12:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:12:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:39 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:12:39 compute-1 nova_compute[226322]: 2026-01-26 10:12:39.202 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422359.202343, 17d96116-83a4-40d0-9dcd-bd5072238621 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 10:12:39 compute-1 nova_compute[226322]: 2026-01-26 10:12:39.203 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] VM Started (Lifecycle Event)
Jan 26 10:12:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:39.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:39 compute-1 podman[235166]: 2026-01-26 10:12:39.279586989 +0000 UTC m=+0.019813616 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 26 10:12:39 compute-1 nova_compute[226322]: 2026-01-26 10:12:39.383 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:12:39 compute-1 nova_compute[226322]: 2026-01-26 10:12:39.388 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422359.2032652, 17d96116-83a4-40d0-9dcd-bd5072238621 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 10:12:39 compute-1 nova_compute[226322]: 2026-01-26 10:12:39.388 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] VM Paused (Lifecycle Event)
Jan 26 10:12:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:39.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:39 compute-1 nova_compute[226322]: 2026-01-26 10:12:39.737 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:39 compute-1 nova_compute[226322]: 2026-01-26 10:12:39.777 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:39 compute-1 ceph-mon[80107]: pgmap v912: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 29 op/s
Jan 26 10:12:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:12:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:41.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:12:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:12:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:41.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:12:41 compute-1 podman[235166]: 2026-01-26 10:12:41.4825926 +0000 UTC m=+2.222819207 container create 4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 26 10:12:42 compute-1 systemd[1]: Started libpod-conmon-4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f.scope.
Jan 26 10:12:42 compute-1 systemd[1]: Started libcrun container.
Jan 26 10:12:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4db7298031f92053143f67de5fae7608006766babe131c36a52e198214e577e1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 10:12:42 compute-1 podman[235166]: 2026-01-26 10:12:42.416235638 +0000 UTC m=+3.156462265 container init 4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 10:12:42 compute-1 podman[235166]: 2026-01-26 10:12:42.422202498 +0000 UTC m=+3.162429125 container start 4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 26 10:12:42 compute-1 neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306[235183]: [NOTICE]   (235187) : New worker (235189) forked
Jan 26 10:12:42 compute-1 neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306[235183]: [NOTICE]   (235187) : Loading success.
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.471 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.474 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 10:12:42 compute-1 ceph-mon[80107]: pgmap v913: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 29 op/s
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.537 226326 DEBUG nova.compute.manager [req-17b4c832-3f09-492e-b91e-bf50a5c874b4 req-6c2353ff-70ba-4795-a673-cc8d6f07bbe0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Received event network-vif-plugged-93f5d287-34ec-424d-8776-df002083762e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.538 226326 DEBUG oslo_concurrency.lockutils [req-17b4c832-3f09-492e-b91e-bf50a5c874b4 req-6c2353ff-70ba-4795-a673-cc8d6f07bbe0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.538 226326 DEBUG oslo_concurrency.lockutils [req-17b4c832-3f09-492e-b91e-bf50a5c874b4 req-6c2353ff-70ba-4795-a673-cc8d6f07bbe0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.538 226326 DEBUG oslo_concurrency.lockutils [req-17b4c832-3f09-492e-b91e-bf50a5c874b4 req-6c2353ff-70ba-4795-a673-cc8d6f07bbe0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.539 226326 DEBUG nova.compute.manager [req-17b4c832-3f09-492e-b91e-bf50a5c874b4 req-6c2353ff-70ba-4795-a673-cc8d6f07bbe0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Processing event network-vif-plugged-93f5d287-34ec-424d-8776-df002083762e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.539 226326 DEBUG nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.550 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.553 226326 INFO nova.virt.libvirt.driver [-] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Instance spawned successfully.
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.554 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.575 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.576 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422362.5438752, 17d96116-83a4-40d0-9dcd-bd5072238621 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.576 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] VM Resumed (Lifecycle Event)
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.643 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.648 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.649 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.650 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.650 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.651 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.655 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.659 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.696 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.770 226326 INFO nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Took 15.22 seconds to spawn the instance on the hypervisor.
Jan 26 10:12:42 compute-1 nova_compute[226322]: 2026-01-26 10:12:42.771 226326 DEBUG nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:12:43 compute-1 nova_compute[226322]: 2026-01-26 10:12:43.063 226326 INFO nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Took 16.53 seconds to build instance.
Jan 26 10:12:43 compute-1 nova_compute[226322]: 2026-01-26 10:12:43.185 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.712s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:12:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:12:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:43.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:12:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:12:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:43.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:12:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:12:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:44 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:12:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:44 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:12:44 compute-1 sudo[235201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:12:44 compute-1 sudo[235201]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:12:44 compute-1 sudo[235201]: pam_unix(sudo:session): session closed for user root
Jan 26 10:12:44 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:44 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:12:44 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:12:44 compute-1 sshd-session[235199]: Invalid user admin from 178.62.249.31 port 55546
Jan 26 10:12:44 compute-1 nova_compute[226322]: 2026-01-26 10:12:44.642 226326 DEBUG nova.compute.manager [req-66f1babd-ff8a-4527-8bf2-b6d519e66785 req-9a1394a4-7c30-4679-976a-bc00194b2799 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Received event network-vif-plugged-93f5d287-34ec-424d-8776-df002083762e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:12:44 compute-1 nova_compute[226322]: 2026-01-26 10:12:44.642 226326 DEBUG oslo_concurrency.lockutils [req-66f1babd-ff8a-4527-8bf2-b6d519e66785 req-9a1394a4-7c30-4679-976a-bc00194b2799 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:12:44 compute-1 nova_compute[226322]: 2026-01-26 10:12:44.642 226326 DEBUG oslo_concurrency.lockutils [req-66f1babd-ff8a-4527-8bf2-b6d519e66785 req-9a1394a4-7c30-4679-976a-bc00194b2799 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:12:44 compute-1 nova_compute[226322]: 2026-01-26 10:12:44.643 226326 DEBUG oslo_concurrency.lockutils [req-66f1babd-ff8a-4527-8bf2-b6d519e66785 req-9a1394a4-7c30-4679-976a-bc00194b2799 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:12:44 compute-1 nova_compute[226322]: 2026-01-26 10:12:44.643 226326 DEBUG nova.compute.manager [req-66f1babd-ff8a-4527-8bf2-b6d519e66785 req-9a1394a4-7c30-4679-976a-bc00194b2799 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] No waiting events found dispatching network-vif-plugged-93f5d287-34ec-424d-8776-df002083762e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 10:12:44 compute-1 nova_compute[226322]: 2026-01-26 10:12:44.643 226326 WARNING nova.compute.manager [req-66f1babd-ff8a-4527-8bf2-b6d519e66785 req-9a1394a4-7c30-4679-976a-bc00194b2799 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Received unexpected event network-vif-plugged-93f5d287-34ec-424d-8776-df002083762e for instance with vm_state active and task_state None.
Jan 26 10:12:44 compute-1 nova_compute[226322]: 2026-01-26 10:12:44.739 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:44 compute-1 sshd-session[235199]: Connection closed by invalid user admin 178.62.249.31 port 55546 [preauth]
Jan 26 10:12:44 compute-1 nova_compute[226322]: 2026-01-26 10:12:44.778 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:45.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:12:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:45.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:12:45 compute-1 ceph-mon[80107]: pgmap v914: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 39 op/s
Jan 26 10:12:45 compute-1 ceph-mon[80107]: pgmap v915: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 7.9 KiB/s rd, 23 KiB/s wr, 12 op/s
Jan 26 10:12:46 compute-1 ceph-mon[80107]: pgmap v916: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.6 MiB/s rd, 23 KiB/s wr, 66 op/s
Jan 26 10:12:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:12:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:47.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:12:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:47.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:48 compute-1 ceph-mon[80107]: pgmap v917: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.6 MiB/s rd, 13 KiB/s wr, 64 op/s
Jan 26 10:12:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:12:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:12:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:12:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:12:49 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:49 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:12:49 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:12:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:12:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:49.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:12:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:49.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:49 compute-1 nova_compute[226322]: 2026-01-26 10:12:49.741 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:49 compute-1 nova_compute[226322]: 2026-01-26 10:12:49.780 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:50 compute-1 ceph-mon[80107]: pgmap v918: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.6 MiB/s rd, 13 KiB/s wr, 64 op/s
Jan 26 10:12:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/101250 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 10:12:50 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [ALERT] 025/101250 (4) : backend 'backend' has no server available!
Jan 26 10:12:51 compute-1 podman[235229]: 2026-01-26 10:12:51.335303744 +0000 UTC m=+0.106163466 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 10:12:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:51.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:51.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:52 compute-1 ceph-mon[80107]: pgmap v919: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Jan 26 10:12:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:53.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:53.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:53.936 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:12:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:53.938 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:12:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:12:53.939 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:12:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:12:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:12:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:12:54 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:54 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:12:54 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:12:54 compute-1 nova_compute[226322]: 2026-01-26 10:12:54.743 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:54 compute-1 nova_compute[226322]: 2026-01-26 10:12:54.782 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:54 compute-1 ceph-mon[80107]: pgmap v920: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 64 op/s
Jan 26 10:12:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:55.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:12:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:55.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:12:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:56 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:12:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:56 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:12:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:56 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:12:56 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:56 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:12:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:12:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:57.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:12:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:12:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:57.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:12:57 compute-1 ceph-mon[80107]: pgmap v921: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 2.3 KiB/s wr, 65 op/s
Jan 26 10:12:57 compute-1 ovn_controller[133670]: 2026-01-26T10:12:57Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:00:c7:32 10.100.0.27
Jan 26 10:12:57 compute-1 ovn_controller[133670]: 2026-01-26T10:12:57Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:00:c7:32 10.100.0.27
Jan 26 10:12:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:59.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:12:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:12:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:59.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:12:59 compute-1 nova_compute[226322]: 2026-01-26 10:12:59.745 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:59 compute-1 nova_compute[226322]: 2026-01-26 10:12:59.783 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:12:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:59 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:12:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:59 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:12:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:59 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:12:59 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:59 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:13:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:13:01 compute-1 ceph-mon[80107]: pgmap v922: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 314 KiB/s rd, 2.3 KiB/s wr, 11 op/s
Jan 26 10:13:01 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/1241123311' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:13:01 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/1241123311' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:13:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:01.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:13:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:01.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:13:02 compute-1 ceph-mon[80107]: pgmap v923: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 314 KiB/s rd, 2.3 KiB/s wr, 11 op/s
Jan 26 10:13:02 compute-1 ceph-mon[80107]: pgmap v924: 353 pgs: 353 active+clean; 193 MiB data, 352 MiB used, 60 GiB / 60 GiB avail; 602 KiB/s rd, 2.1 MiB/s wr, 71 op/s
Jan 26 10:13:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:03.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:03.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:13:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:13:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:13:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:13:04 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:04 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:13:04 compute-1 sudo[235257]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:13:04 compute-1 sudo[235257]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:13:04 compute-1 sudo[235257]: pam_unix(sudo:session): session closed for user root
Jan 26 10:13:04 compute-1 nova_compute[226322]: 2026-01-26 10:13:04.748 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:04 compute-1 nova_compute[226322]: 2026-01-26 10:13:04.786 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:04 compute-1 ceph-mon[80107]: pgmap v925: 353 pgs: 353 active+clean; 193 MiB data, 352 MiB used, 60 GiB / 60 GiB avail; 291 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Jan 26 10:13:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:13:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:05.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:13:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:05.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:06 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:13:06 compute-1 ceph-mon[80107]: pgmap v926: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 339 KiB/s rd, 2.1 MiB/s wr, 96 op/s
Jan 26 10:13:07 compute-1 sshd-session[235284]: Invalid user centos from 104.248.198.71 port 58300
Jan 26 10:13:07 compute-1 podman[235286]: 2026-01-26 10:13:07.291163167 +0000 UTC m=+0.090285314 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3)
Jan 26 10:13:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:07.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:07.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:07 compute-1 sshd-session[235284]: Connection closed by invalid user centos 104.248.198.71 port 58300 [preauth]
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.642287) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422388642312, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1703, "num_deletes": 255, "total_data_size": 4470135, "memory_usage": 4524432, "flush_reason": "Manual Compaction"}
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422388667867, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 2876864, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28372, "largest_seqno": 30070, "table_properties": {"data_size": 2869763, "index_size": 4108, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 14635, "raw_average_key_size": 19, "raw_value_size": 2855521, "raw_average_value_size": 3807, "num_data_blocks": 180, "num_entries": 750, "num_filter_entries": 750, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769422248, "oldest_key_time": 1769422248, "file_creation_time": 1769422388, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 25620 microseconds, and 6958 cpu microseconds.
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.667903) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 2876864 bytes OK
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.667926) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.669307) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.669319) EVENT_LOG_v1 {"time_micros": 1769422388669315, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.669581) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 4462339, prev total WAL file size 4462339, number of live WAL files 2.
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.670863) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353031' seq:72057594037927935, type:22 .. '6C6F676D00373532' seq:0, type:0; will stop at (end)
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(2809KB)], [54(13MB)]
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422388670983, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 17431016, "oldest_snapshot_seqno": -1}
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 6045 keys, 17284848 bytes, temperature: kUnknown
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422388814073, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 17284848, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17240868, "index_size": 27742, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15173, "raw_key_size": 153717, "raw_average_key_size": 25, "raw_value_size": 17128288, "raw_average_value_size": 2833, "num_data_blocks": 1138, "num_entries": 6045, "num_filter_entries": 6045, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769422388, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.814410) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 17284848 bytes
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.816253) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 121.7 rd, 120.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.7, 13.9 +0.0 blob) out(16.5 +0.0 blob), read-write-amplify(12.1) write-amplify(6.0) OK, records in: 6573, records dropped: 528 output_compression: NoCompression
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.816278) EVENT_LOG_v1 {"time_micros": 1769422388816265, "job": 32, "event": "compaction_finished", "compaction_time_micros": 143242, "compaction_time_cpu_micros": 60715, "output_level": 6, "num_output_files": 1, "total_output_size": 17284848, "num_input_records": 6573, "num_output_records": 6045, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422388816888, "job": 32, "event": "table_file_deletion", "file_number": 56}
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422388819421, "job": 32, "event": "table_file_deletion", "file_number": 54}
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.670731) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.819539) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.819546) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.819548) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.819550) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:13:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.819552) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:13:08 compute-1 ceph-mon[80107]: pgmap v927: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 336 KiB/s rd, 2.1 MiB/s wr, 95 op/s
Jan 26 10:13:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:13:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:13:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:13:09 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:09 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:13:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:09.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:09.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:09 compute-1 nova_compute[226322]: 2026-01-26 10:13:09.750 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:09 compute-1 nova_compute[226322]: 2026-01-26 10:13:09.790 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:11 compute-1 ceph-mon[80107]: pgmap v928: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 336 KiB/s rd, 2.1 MiB/s wr, 95 op/s
Jan 26 10:13:11 compute-1 sshd-session[235316]: Invalid user admin from 152.42.136.110 port 49228
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.058 226326 DEBUG oslo_concurrency.lockutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "17d96116-83a4-40d0-9dcd-bd5072238621" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.058 226326 DEBUG oslo_concurrency.lockutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.059 226326 DEBUG oslo_concurrency.lockutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.059 226326 DEBUG oslo_concurrency.lockutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.059 226326 DEBUG oslo_concurrency.lockutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.061 226326 INFO nova.compute.manager [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Terminating instance
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.062 226326 DEBUG nova.compute.manager [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 10:13:11 compute-1 sshd-session[235316]: Connection closed by invalid user admin 152.42.136.110 port 49228 [preauth]
Jan 26 10:13:11 compute-1 kernel: tap93f5d287-34 (unregistering): left promiscuous mode
Jan 26 10:13:11 compute-1 NetworkManager[49073]: <info>  [1769422391.1203] device (tap93f5d287-34): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.129 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:11 compute-1 ovn_controller[133670]: 2026-01-26T10:13:11Z|00053|binding|INFO|Releasing lport 93f5d287-34ec-424d-8776-df002083762e from this chassis (sb_readonly=0)
Jan 26 10:13:11 compute-1 ovn_controller[133670]: 2026-01-26T10:13:11Z|00054|binding|INFO|Setting lport 93f5d287-34ec-424d-8776-df002083762e down in Southbound
Jan 26 10:13:11 compute-1 ovn_controller[133670]: 2026-01-26T10:13:11Z|00055|binding|INFO|Removing iface tap93f5d287-34 ovn-installed in OVS
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.133 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:11 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.137 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:c7:32 10.100.0.27'], port_security=['fa:16:3e:00:c7:32 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '17d96116-83a4-40d0-9dcd-bd5072238621', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae1cb66c-0987-4156-9bdb-cb2a08957306', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b040f826-b5f4-4627-aa36-62cc16e0727f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=73bcc0f9-41ce-47a1-86a1-53fe1b73bb31, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], logical_port=93f5d287-34ec-424d-8776-df002083762e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:13:11 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.138 143326 INFO neutron.agent.ovn.metadata.agent [-] Port 93f5d287-34ec-424d-8776-df002083762e in datapath ae1cb66c-0987-4156-9bdb-cb2a08957306 unbound from our chassis
Jan 26 10:13:11 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.139 143326 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ae1cb66c-0987-4156-9bdb-cb2a08957306, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 10:13:11 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.142 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[f77db2ff-4906-4e3c-a106-1186c31bab7c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:13:11 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.143 143326 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306 namespace which is not needed anymore
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.150 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:11 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Deactivated successfully.
Jan 26 10:13:11 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Consumed 14.372s CPU time.
Jan 26 10:13:11 compute-1 systemd-machined[194876]: Machine qemu-3-instance-00000007 terminated.
Jan 26 10:13:11 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.299 226326 INFO nova.virt.libvirt.driver [-] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Instance destroyed successfully.
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.299 226326 DEBUG nova.objects.instance [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'resources' on Instance uuid 17d96116-83a4-40d0-9dcd-bd5072238621 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 10:13:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:11.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 10:13:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:11.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 10:13:11 compute-1 neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306[235183]: [NOTICE]   (235187) : haproxy version is 2.8.14-c23fe91
Jan 26 10:13:11 compute-1 neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306[235183]: [NOTICE]   (235187) : path to executable is /usr/sbin/haproxy
Jan 26 10:13:11 compute-1 neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306[235183]: [WARNING]  (235187) : Exiting Master process...
Jan 26 10:13:11 compute-1 neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306[235183]: [ALERT]    (235187) : Current worker (235189) exited with code 143 (Terminated)
Jan 26 10:13:11 compute-1 neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306[235183]: [WARNING]  (235187) : All workers exited. Exiting... (0)
Jan 26 10:13:11 compute-1 systemd[1]: libpod-4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f.scope: Deactivated successfully.
Jan 26 10:13:11 compute-1 podman[235344]: 2026-01-26 10:13:11.651549724 +0000 UTC m=+0.406588575 container died 4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.693 226326 DEBUG nova.virt.libvirt.vif [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T10:12:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-200115615',display_name='tempest-TestNetworkBasicOps-server-200115615',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-200115615',id=7,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNdcGSrtCXDvk89I0Vz9B7It1mSwlDvkZdMdWLCvoVbAe00VgK1axvS1LkIm+2Wq3uLZqcGzySpB1p+E5CAX8FNwfd40H16DcgIFt/DJC5r2xLViVsjIsqcjjLML0XecLw==',key_name='tempest-TestNetworkBasicOps-1792120849',keypairs=<?>,launch_index=0,launched_at=2026-01-26T10:12:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-0x8cj090',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T10:12:42Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=17d96116-83a4-40d0-9dcd-bd5072238621,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "93f5d287-34ec-424d-8776-df002083762e", "address": "fa:16:3e:00:c7:32", "network": {"id": "ae1cb66c-0987-4156-9bdb-cb2a08957306", "bridge": "br-int", "label": "tempest-network-smoke--514366077", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93f5d287-34", "ovs_interfaceid": "93f5d287-34ec-424d-8776-df002083762e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.693 226326 DEBUG nova.network.os_vif_util [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "93f5d287-34ec-424d-8776-df002083762e", "address": "fa:16:3e:00:c7:32", "network": {"id": "ae1cb66c-0987-4156-9bdb-cb2a08957306", "bridge": "br-int", "label": "tempest-network-smoke--514366077", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93f5d287-34", "ovs_interfaceid": "93f5d287-34ec-424d-8776-df002083762e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.695 226326 DEBUG nova.network.os_vif_util [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:c7:32,bridge_name='br-int',has_traffic_filtering=True,id=93f5d287-34ec-424d-8776-df002083762e,network=Network(ae1cb66c-0987-4156-9bdb-cb2a08957306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93f5d287-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.696 226326 DEBUG os_vif [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:c7:32,bridge_name='br-int',has_traffic_filtering=True,id=93f5d287-34ec-424d-8776-df002083762e,network=Network(ae1cb66c-0987-4156-9bdb-cb2a08957306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93f5d287-34') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.702 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.703 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap93f5d287-34, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.707 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.709 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.713 226326 INFO os_vif [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:c7:32,bridge_name='br-int',has_traffic_filtering=True,id=93f5d287-34ec-424d-8776-df002083762e,network=Network(ae1cb66c-0987-4156-9bdb-cb2a08957306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93f5d287-34')
Jan 26 10:13:11 compute-1 systemd[1]: var-lib-containers-storage-overlay-4db7298031f92053143f67de5fae7608006766babe131c36a52e198214e577e1-merged.mount: Deactivated successfully.
Jan 26 10:13:11 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f-userdata-shm.mount: Deactivated successfully.
Jan 26 10:13:11 compute-1 podman[235344]: 2026-01-26 10:13:11.72388907 +0000 UTC m=+0.478927901 container cleanup 4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 26 10:13:11 compute-1 systemd[1]: libpod-conmon-4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f.scope: Deactivated successfully.
Jan 26 10:13:11 compute-1 podman[235402]: 2026-01-26 10:13:11.903967471 +0000 UTC m=+0.152109277 container remove 4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 10:13:11 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.910 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[28f549a5-4cd2-4576-937e-8a60af3123a8]: (4, ('Mon Jan 26 10:13:11 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306 (4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f)\n4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f\nMon Jan 26 10:13:11 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306 (4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f)\n4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:13:11 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.912 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[9b705132-3ae3-4b34-81de-5a625ae13bce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:13:11 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.914 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae1cb66c-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.916 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:11 compute-1 kernel: tapae1cb66c-00: left promiscuous mode
Jan 26 10:13:11 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.924 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[662775cd-bf9f-4a26-9e37-f396aa217a87]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.927 226326 DEBUG nova.compute.manager [req-ff4c01a0-7ff1-4dc8-bf34-9c642c122574 req-e19dc28a-b541-44fc-863a-94c282fb2c53 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Received event network-vif-unplugged-93f5d287-34ec-424d-8776-df002083762e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.928 226326 DEBUG oslo_concurrency.lockutils [req-ff4c01a0-7ff1-4dc8-bf34-9c642c122574 req-e19dc28a-b541-44fc-863a-94c282fb2c53 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.928 226326 DEBUG oslo_concurrency.lockutils [req-ff4c01a0-7ff1-4dc8-bf34-9c642c122574 req-e19dc28a-b541-44fc-863a-94c282fb2c53 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.928 226326 DEBUG oslo_concurrency.lockutils [req-ff4c01a0-7ff1-4dc8-bf34-9c642c122574 req-e19dc28a-b541-44fc-863a-94c282fb2c53 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.929 226326 DEBUG nova.compute.manager [req-ff4c01a0-7ff1-4dc8-bf34-9c642c122574 req-e19dc28a-b541-44fc-863a-94c282fb2c53 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] No waiting events found dispatching network-vif-unplugged-93f5d287-34ec-424d-8776-df002083762e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.929 226326 DEBUG nova.compute.manager [req-ff4c01a0-7ff1-4dc8-bf34-9c642c122574 req-e19dc28a-b541-44fc-863a-94c282fb2c53 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Received event network-vif-unplugged-93f5d287-34ec-424d-8776-df002083762e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 10:13:11 compute-1 nova_compute[226322]: 2026-01-26 10:13:11.934 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:11 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.944 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[6e748751-705d-4735-a605-c76f235210f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:13:11 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.946 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[a5a1a263-18fb-4602-895b-77ea0eb182be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:13:11 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.966 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[c71d3fc2-4fc8-44c9-835e-cba64ac57562]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431920, 'reachable_time': 20765, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235420, 'error': None, 'target': 'ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:13:11 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.971 143615 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 10:13:11 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.971 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[3512caa1-81cb-4598-9d42-76405e84b099]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:13:11 compute-1 systemd[1]: run-netns-ovnmeta\x2dae1cb66c\x2d0987\x2d4156\x2d9bdb\x2dcb2a08957306.mount: Deactivated successfully.
Jan 26 10:13:12 compute-1 nova_compute[226322]: 2026-01-26 10:13:12.471 226326 INFO nova.virt.libvirt.driver [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Deleting instance files /var/lib/nova/instances/17d96116-83a4-40d0-9dcd-bd5072238621_del
Jan 26 10:13:12 compute-1 nova_compute[226322]: 2026-01-26 10:13:12.472 226326 INFO nova.virt.libvirt.driver [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Deletion of /var/lib/nova/instances/17d96116-83a4-40d0-9dcd-bd5072238621_del complete
Jan 26 10:13:12 compute-1 nova_compute[226322]: 2026-01-26 10:13:12.531 226326 INFO nova.compute.manager [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Took 1.47 seconds to destroy the instance on the hypervisor.
Jan 26 10:13:12 compute-1 nova_compute[226322]: 2026-01-26 10:13:12.532 226326 DEBUG oslo.service.loopingcall [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 10:13:12 compute-1 nova_compute[226322]: 2026-01-26 10:13:12.533 226326 DEBUG nova.compute.manager [-] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 10:13:12 compute-1 nova_compute[226322]: 2026-01-26 10:13:12.533 226326 DEBUG nova.network.neutron [-] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 10:13:13 compute-1 ceph-mon[80107]: pgmap v929: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 353 KiB/s rd, 2.1 MiB/s wr, 124 op/s
Jan 26 10:13:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:13.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:13 compute-1 nova_compute[226322]: 2026-01-26 10:13:13.474 226326 DEBUG nova.network.neutron [-] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:13:13 compute-1 nova_compute[226322]: 2026-01-26 10:13:13.501 226326 INFO nova.compute.manager [-] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Took 0.97 seconds to deallocate network for instance.
Jan 26 10:13:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:13.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:13 compute-1 nova_compute[226322]: 2026-01-26 10:13:13.546 226326 DEBUG nova.compute.manager [req-ce3bdba2-53ca-4c2c-b78f-868aa9aa2dd9 req-3dd56532-8664-45ae-a04a-7e062b24d5be b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Received event network-vif-deleted-93f5d287-34ec-424d-8776-df002083762e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:13:13 compute-1 nova_compute[226322]: 2026-01-26 10:13:13.565 226326 DEBUG oslo_concurrency.lockutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:13:13 compute-1 nova_compute[226322]: 2026-01-26 10:13:13.565 226326 DEBUG oslo_concurrency.lockutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:13:13 compute-1 nova_compute[226322]: 2026-01-26 10:13:13.615 226326 DEBUG oslo_concurrency.processutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:13:13 compute-1 sudo[235424]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:13:13 compute-1 sudo[235424]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:13:13 compute-1 sudo[235424]: pam_unix(sudo:session): session closed for user root
Jan 26 10:13:13 compute-1 sudo[235449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:13:13 compute-1 sudo[235449]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:13:13 compute-1 nova_compute[226322]: 2026-01-26 10:13:13.994 226326 DEBUG nova.compute.manager [req-ea02cfb7-930f-4e15-a5c3-327bbdd4e57c req-090ec7a2-64ab-4820-bda9-0bd24f87bc57 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Received event network-vif-plugged-93f5d287-34ec-424d-8776-df002083762e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:13:13 compute-1 nova_compute[226322]: 2026-01-26 10:13:13.995 226326 DEBUG oslo_concurrency.lockutils [req-ea02cfb7-930f-4e15-a5c3-327bbdd4e57c req-090ec7a2-64ab-4820-bda9-0bd24f87bc57 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:13:13 compute-1 nova_compute[226322]: 2026-01-26 10:13:13.995 226326 DEBUG oslo_concurrency.lockutils [req-ea02cfb7-930f-4e15-a5c3-327bbdd4e57c req-090ec7a2-64ab-4820-bda9-0bd24f87bc57 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:13:13 compute-1 nova_compute[226322]: 2026-01-26 10:13:13.996 226326 DEBUG oslo_concurrency.lockutils [req-ea02cfb7-930f-4e15-a5c3-327bbdd4e57c req-090ec7a2-64ab-4820-bda9-0bd24f87bc57 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:13:13 compute-1 nova_compute[226322]: 2026-01-26 10:13:13.996 226326 DEBUG nova.compute.manager [req-ea02cfb7-930f-4e15-a5c3-327bbdd4e57c req-090ec7a2-64ab-4820-bda9-0bd24f87bc57 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] No waiting events found dispatching network-vif-plugged-93f5d287-34ec-424d-8776-df002083762e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 10:13:13 compute-1 nova_compute[226322]: 2026-01-26 10:13:13.996 226326 WARNING nova.compute.manager [req-ea02cfb7-930f-4e15-a5c3-327bbdd4e57c req-090ec7a2-64ab-4820-bda9-0bd24f87bc57 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Received unexpected event network-vif-plugged-93f5d287-34ec-424d-8776-df002083762e for instance with vm_state deleted and task_state None.
Jan 26 10:13:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:13:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:13:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:13:14 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:14 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:13:14 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:13:14 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1814321471' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:13:14 compute-1 nova_compute[226322]: 2026-01-26 10:13:14.061 226326 DEBUG oslo_concurrency.processutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:13:14 compute-1 nova_compute[226322]: 2026-01-26 10:13:14.069 226326 DEBUG nova.compute.provider_tree [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:13:14 compute-1 ceph-mon[80107]: pgmap v930: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 65 KiB/s rd, 96 KiB/s wr, 64 op/s
Jan 26 10:13:14 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1814321471' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:13:14 compute-1 nova_compute[226322]: 2026-01-26 10:13:14.083 226326 DEBUG nova.scheduler.client.report [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:13:14 compute-1 nova_compute[226322]: 2026-01-26 10:13:14.109 226326 DEBUG oslo_concurrency.lockutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.544s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:13:14 compute-1 nova_compute[226322]: 2026-01-26 10:13:14.137 226326 INFO nova.scheduler.client.report [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Deleted allocations for instance 17d96116-83a4-40d0-9dcd-bd5072238621
Jan 26 10:13:14 compute-1 nova_compute[226322]: 2026-01-26 10:13:14.196 226326 DEBUG oslo_concurrency.lockutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:13:14 compute-1 sudo[235449]: pam_unix(sudo:session): session closed for user root
Jan 26 10:13:14 compute-1 nova_compute[226322]: 2026-01-26 10:13:14.793 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:15 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:13:15 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:13:15 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:13:15 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:13:15 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:13:15 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:13:15 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:13:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:15.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:15.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:16 compute-1 ceph-mon[80107]: pgmap v931: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 84 KiB/s rd, 98 KiB/s wr, 92 op/s
Jan 26 10:13:16 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:13:16 compute-1 nova_compute[226322]: 2026-01-26 10:13:16.709 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 10:13:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:17.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 10:13:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:17.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:18 compute-1 nova_compute[226322]: 2026-01-26 10:13:18.333 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:18 compute-1 ceph-mon[80107]: pgmap v932: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 16 KiB/s wr, 57 op/s
Jan 26 10:13:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:13:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:13:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:13:19 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:13:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:19.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:19.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:19 compute-1 nova_compute[226322]: 2026-01-26 10:13:19.794 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:20 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:13:20 compute-1 sudo[235528]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:13:20 compute-1 sudo[235528]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:13:20 compute-1 sudo[235528]: pam_unix(sudo:session): session closed for user root
Jan 26 10:13:21 compute-1 ceph-mon[80107]: pgmap v933: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 16 KiB/s wr, 57 op/s
Jan 26 10:13:21 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:13:21 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:13:21 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:13:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:13:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:21.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:13:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:13:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:21.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:13:21 compute-1 nova_compute[226322]: 2026-01-26 10:13:21.713 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:22 compute-1 podman[235554]: 2026-01-26 10:13:22.312546397 +0000 UTC m=+0.081670936 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 10:13:22 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/4042908871' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:13:22 compute-1 ceph-mon[80107]: pgmap v934: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 44 KiB/s rd, 17 KiB/s wr, 66 op/s
Jan 26 10:13:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:23.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 10:13:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:23.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 10:13:23 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:13:23.878 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:13:23 compute-1 nova_compute[226322]: 2026-01-26 10:13:23.879 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:23 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:13:23.880 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 10:13:23 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:13:23.882 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:13:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:13:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:13:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:13:24 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:13:24 compute-1 sudo[235574]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:13:24 compute-1 sudo[235574]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:13:24 compute-1 sudo[235574]: pam_unix(sudo:session): session closed for user root
Jan 26 10:13:24 compute-1 nova_compute[226322]: 2026-01-26 10:13:24.797 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:25 compute-1 ceph-mon[80107]: pgmap v935: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 2.4 KiB/s wr, 37 op/s
Jan 26 10:13:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:13:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:25.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:13:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 10:13:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:25.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 10:13:26 compute-1 ceph-mon[80107]: pgmap v936: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 39 KiB/s rd, 3.3 KiB/s wr, 57 op/s
Jan 26 10:13:26 compute-1 nova_compute[226322]: 2026-01-26 10:13:26.298 226326 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769422391.2958548, 17d96116-83a4-40d0-9dcd-bd5072238621 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 10:13:26 compute-1 nova_compute[226322]: 2026-01-26 10:13:26.298 226326 INFO nova.compute.manager [-] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] VM Stopped (Lifecycle Event)
Jan 26 10:13:26 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:13:26 compute-1 nova_compute[226322]: 2026-01-26 10:13:26.637 226326 DEBUG nova.compute.manager [None req-c3236d7f-3751-4c0d-8edd-1d434244810c - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:13:26 compute-1 nova_compute[226322]: 2026-01-26 10:13:26.716 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:27.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:27.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:28 compute-1 sshd-session[235601]: Invalid user admin from 178.62.249.31 port 40642
Jan 26 10:13:28 compute-1 ceph-mon[80107]: pgmap v937: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 2.2 KiB/s wr, 28 op/s
Jan 26 10:13:28 compute-1 sshd-session[235601]: Connection closed by invalid user admin 178.62.249.31 port 40642 [preauth]
Jan 26 10:13:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:13:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:13:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:13:29 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:13:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 10:13:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:29.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 10:13:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:29.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:29 compute-1 nova_compute[226322]: 2026-01-26 10:13:29.799 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:31 compute-1 ceph-mon[80107]: pgmap v938: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 2.2 KiB/s wr, 28 op/s
Jan 26 10:13:31 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:13:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:31.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:13:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:31.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:13:31 compute-1 nova_compute[226322]: 2026-01-26 10:13:31.719 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:31 compute-1 nova_compute[226322]: 2026-01-26 10:13:31.811 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:13:33 compute-1 ceph-mon[80107]: pgmap v939: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 2.2 KiB/s wr, 29 op/s
Jan 26 10:13:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 10:13:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:33.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 10:13:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:33.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:13:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:13:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:13:34 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:34 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:13:34 compute-1 nova_compute[226322]: 2026-01-26 10:13:34.436 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:13:34 compute-1 nova_compute[226322]: 2026-01-26 10:13:34.437 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:13:34 compute-1 nova_compute[226322]: 2026-01-26 10:13:34.437 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:13:34 compute-1 ceph-mon[80107]: pgmap v940: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 14 KiB/s rd, 938 B/s wr, 20 op/s
Jan 26 10:13:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:13:34 compute-1 nova_compute[226322]: 2026-01-26 10:13:34.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:13:34 compute-1 nova_compute[226322]: 2026-01-26 10:13:34.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:13:34 compute-1 nova_compute[226322]: 2026-01-26 10:13:34.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:13:34 compute-1 nova_compute[226322]: 2026-01-26 10:13:34.688 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:13:34 compute-1 nova_compute[226322]: 2026-01-26 10:13:34.730 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:13:34 compute-1 nova_compute[226322]: 2026-01-26 10:13:34.730 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:13:34 compute-1 nova_compute[226322]: 2026-01-26 10:13:34.801 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:35.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:35.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:35 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1648272998' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:13:35 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2322081899' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:13:35 compute-1 nova_compute[226322]: 2026-01-26 10:13:35.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:13:35 compute-1 nova_compute[226322]: 2026-01-26 10:13:35.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:13:35 compute-1 nova_compute[226322]: 2026-01-26 10:13:35.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:13:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:13:36 compute-1 nova_compute[226322]: 2026-01-26 10:13:36.722 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:36 compute-1 ceph-mon[80107]: pgmap v941: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 14 KiB/s rd, 938 B/s wr, 20 op/s
Jan 26 10:13:36 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3631170278' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:13:36 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1416506356' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:13:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:37.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:37.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:37 compute-1 nova_compute[226322]: 2026-01-26 10:13:37.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:13:37 compute-1 nova_compute[226322]: 2026-01-26 10:13:37.788 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:13:37 compute-1 nova_compute[226322]: 2026-01-26 10:13:37.789 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:13:37 compute-1 nova_compute[226322]: 2026-01-26 10:13:37.789 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:13:37 compute-1 nova_compute[226322]: 2026-01-26 10:13:37.789 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:13:37 compute-1 nova_compute[226322]: 2026-01-26 10:13:37.789 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:13:38 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:13:38 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2944046988' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:13:38 compute-1 nova_compute[226322]: 2026-01-26 10:13:38.207 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:13:38 compute-1 podman[235630]: 2026-01-26 10:13:38.307859092 +0000 UTC m=+0.081405539 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 10:13:38 compute-1 nova_compute[226322]: 2026-01-26 10:13:38.368 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:13:38 compute-1 nova_compute[226322]: 2026-01-26 10:13:38.369 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4901MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:13:38 compute-1 nova_compute[226322]: 2026-01-26 10:13:38.369 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:13:38 compute-1 nova_compute[226322]: 2026-01-26 10:13:38.369 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:13:38 compute-1 nova_compute[226322]: 2026-01-26 10:13:38.507 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:13:38 compute-1 nova_compute[226322]: 2026-01-26 10:13:38.508 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:13:38 compute-1 nova_compute[226322]: 2026-01-26 10:13:38.532 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:13:38 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:13:38 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/486237949' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:13:38 compute-1 nova_compute[226322]: 2026-01-26 10:13:38.989 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:13:38 compute-1 nova_compute[226322]: 2026-01-26 10:13:38.994 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:13:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:13:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:13:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:13:39 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:13:39 compute-1 ceph-mon[80107]: pgmap v942: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:13:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2944046988' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:13:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/486237949' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:13:39 compute-1 nova_compute[226322]: 2026-01-26 10:13:39.219 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:13:39 compute-1 nova_compute[226322]: 2026-01-26 10:13:39.376 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:13:39 compute-1 nova_compute[226322]: 2026-01-26 10:13:39.377 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:13:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:39.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:39.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:39 compute-1 nova_compute[226322]: 2026-01-26 10:13:39.803 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:40 compute-1 ceph-mon[80107]: pgmap v943: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:13:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:13:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:41.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:13:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:41.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:41 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:13:41 compute-1 nova_compute[226322]: 2026-01-26 10:13:41.726 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:42 compute-1 ceph-mon[80107]: pgmap v944: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:13:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:13:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:13:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:13:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:13:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:43.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:43.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:44 compute-1 sudo[235682]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:13:44 compute-1 sudo[235682]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:13:44 compute-1 sudo[235682]: pam_unix(sudo:session): session closed for user root
Jan 26 10:13:44 compute-1 nova_compute[226322]: 2026-01-26 10:13:44.805 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:45 compute-1 ceph-mon[80107]: pgmap v945: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:13:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:45.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:45.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:46 compute-1 ceph-mon[80107]: pgmap v946: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:13:46 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:13:46 compute-1 nova_compute[226322]: 2026-01-26 10:13:46.766 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:47.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:47.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:13:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:13:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:13:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:13:48 compute-1 ceph-mon[80107]: pgmap v947: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:13:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 10:13:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:49.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 10:13:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:49.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:49 compute-1 nova_compute[226322]: 2026-01-26 10:13:49.808 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:13:50 compute-1 sshd-session[235709]: Invalid user admin from 152.42.136.110 port 51532
Jan 26 10:13:50 compute-1 sshd-session[235709]: Connection closed by invalid user admin 152.42.136.110 port 51532 [preauth]
Jan 26 10:13:50 compute-1 ceph-mon[80107]: pgmap v948: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:13:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:51.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:13:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:51.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:13:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:13:51 compute-1 nova_compute[226322]: 2026-01-26 10:13:51.769 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:13:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:13:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:13:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:13:53 compute-1 ceph-mon[80107]: pgmap v949: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:13:53 compute-1 podman[235713]: 2026-01-26 10:13:53.292081697 +0000 UTC m=+0.068792913 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 26 10:13:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:53.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:53.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:13:53.937 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:13:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:13:53.937 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:13:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:13:53.937 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:13:54 compute-1 ceph-mon[80107]: pgmap v950: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:13:54 compute-1 sshd-session[235733]: Invalid user centos from 104.248.198.71 port 34464
Jan 26 10:13:54 compute-1 sshd-session[235733]: Connection closed by invalid user centos 104.248.198.71 port 34464 [preauth]
Jan 26 10:13:54 compute-1 nova_compute[226322]: 2026-01-26 10:13:54.810 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:55.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:13:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:55.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:13:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:13:56 compute-1 nova_compute[226322]: 2026-01-26 10:13:56.772 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:13:56 compute-1 ceph-mon[80107]: pgmap v951: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:13:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:13:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:57.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:13:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:13:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:57.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:13:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:13:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:13:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:13:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:13:58 compute-1 ceph-mon[80107]: pgmap v952: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:13:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/465910940' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:13:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/465910940' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:13:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.003000089s ======
Jan 26 10:13:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:59.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000089s
Jan 26 10:13:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:13:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 10:13:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:59.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 10:13:59 compute-1 nova_compute[226322]: 2026-01-26 10:13:59.812 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:00 compute-1 ceph-mon[80107]: pgmap v953: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:14:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:01.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:01.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:14:01 compute-1 nova_compute[226322]: 2026-01-26 10:14:01.775 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:02 compute-1 ceph-mon[80107]: pgmap v954: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:14:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:14:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:14:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:14:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:14:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:03.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:03.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:14:04 compute-1 nova_compute[226322]: 2026-01-26 10:14:04.424 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "51ec8779-f667-4f68-853c-545679d761b9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:14:04 compute-1 nova_compute[226322]: 2026-01-26 10:14:04.425 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:14:04 compute-1 nova_compute[226322]: 2026-01-26 10:14:04.446 226326 DEBUG nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 10:14:04 compute-1 sudo[235740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:14:04 compute-1 sudo[235740]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:14:04 compute-1 sudo[235740]: pam_unix(sudo:session): session closed for user root
Jan 26 10:14:04 compute-1 nova_compute[226322]: 2026-01-26 10:14:04.539 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:14:04 compute-1 nova_compute[226322]: 2026-01-26 10:14:04.539 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:14:04 compute-1 nova_compute[226322]: 2026-01-26 10:14:04.546 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 10:14:04 compute-1 nova_compute[226322]: 2026-01-26 10:14:04.546 226326 INFO nova.compute.claims [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Claim successful on node compute-1.ctlplane.example.com
Jan 26 10:14:04 compute-1 nova_compute[226322]: 2026-01-26 10:14:04.693 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:14:04 compute-1 ceph-mon[80107]: pgmap v955: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:14:04 compute-1 nova_compute[226322]: 2026-01-26 10:14:04.814 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:05 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:14:05 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/12435975' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.177 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.182 226326 DEBUG nova.compute.provider_tree [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.201 226326 DEBUG nova.scheduler.client.report [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.232 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.693s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.233 226326 DEBUG nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.287 226326 DEBUG nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.288 226326 DEBUG nova.network.neutron [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.306 226326 INFO nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.325 226326 DEBUG nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.412 226326 DEBUG nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.414 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.414 226326 INFO nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Creating image(s)
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.439 226326 DEBUG nova.storage.rbd_utils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 51ec8779-f667-4f68-853c-545679d761b9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.467 226326 DEBUG nova.storage.rbd_utils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 51ec8779-f667-4f68-853c-545679d761b9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:14:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 10:14:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:05.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.491 226326 DEBUG nova.storage.rbd_utils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 51ec8779-f667-4f68-853c-545679d761b9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.494 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.545 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.546 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "d81880e926e175d0cc7241caa7cc18231a8a289c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.547 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "d81880e926e175d0cc7241caa7cc18231a8a289c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.547 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "d81880e926e175d0cc7241caa7cc18231a8a289c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.571 226326 DEBUG nova.storage.rbd_utils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 51ec8779-f667-4f68-853c-545679d761b9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.574 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c 51ec8779-f667-4f68-853c-545679d761b9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:14:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:14:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:05.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:14:05 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/12435975' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.822 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c 51ec8779-f667-4f68-853c-545679d761b9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.248s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.890 226326 DEBUG nova.storage.rbd_utils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] resizing rbd image 51ec8779-f667-4f68-853c-545679d761b9_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 26 10:14:05 compute-1 nova_compute[226322]: 2026-01-26 10:14:05.996 226326 DEBUG nova.objects.instance [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'migration_context' on Instance uuid 51ec8779-f667-4f68-853c-545679d761b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 10:14:06 compute-1 nova_compute[226322]: 2026-01-26 10:14:06.010 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 10:14:06 compute-1 nova_compute[226322]: 2026-01-26 10:14:06.010 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Ensure instance console log exists: /var/lib/nova/instances/51ec8779-f667-4f68-853c-545679d761b9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 10:14:06 compute-1 nova_compute[226322]: 2026-01-26 10:14:06.011 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:14:06 compute-1 nova_compute[226322]: 2026-01-26 10:14:06.011 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:14:06 compute-1 nova_compute[226322]: 2026-01-26 10:14:06.011 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:14:06 compute-1 nova_compute[226322]: 2026-01-26 10:14:06.222 226326 DEBUG nova.policy [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c1208d3e25b940ea93fe76884c7a53db', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 10:14:06 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:14:06 compute-1 nova_compute[226322]: 2026-01-26 10:14:06.778 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:06 compute-1 ceph-mon[80107]: pgmap v956: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:14:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:07.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:07.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:14:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:14:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:14:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:14:08 compute-1 nova_compute[226322]: 2026-01-26 10:14:08.365 226326 DEBUG nova.network.neutron [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Successfully updated port: 386a7730-6a16-4b18-b368-561762a8f7af _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 10:14:08 compute-1 nova_compute[226322]: 2026-01-26 10:14:08.453 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "refresh_cache-51ec8779-f667-4f68-853c-545679d761b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:14:08 compute-1 nova_compute[226322]: 2026-01-26 10:14:08.454 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquired lock "refresh_cache-51ec8779-f667-4f68-853c-545679d761b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:14:08 compute-1 nova_compute[226322]: 2026-01-26 10:14:08.454 226326 DEBUG nova.network.neutron [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 10:14:08 compute-1 nova_compute[226322]: 2026-01-26 10:14:08.459 226326 DEBUG nova.compute.manager [req-eb7a636b-e5ae-49f8-bb40-ddfaa0576843 req-d58107c8-25f1-4e59-a671-4579ab4d606d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Received event network-changed-386a7730-6a16-4b18-b368-561762a8f7af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:14:08 compute-1 nova_compute[226322]: 2026-01-26 10:14:08.460 226326 DEBUG nova.compute.manager [req-eb7a636b-e5ae-49f8-bb40-ddfaa0576843 req-d58107c8-25f1-4e59-a671-4579ab4d606d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Refreshing instance network info cache due to event network-changed-386a7730-6a16-4b18-b368-561762a8f7af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 10:14:08 compute-1 nova_compute[226322]: 2026-01-26 10:14:08.460 226326 DEBUG oslo_concurrency.lockutils [req-eb7a636b-e5ae-49f8-bb40-ddfaa0576843 req-d58107c8-25f1-4e59-a671-4579ab4d606d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-51ec8779-f667-4f68-853c-545679d761b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:14:08 compute-1 ceph-mon[80107]: pgmap v957: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:14:09 compute-1 nova_compute[226322]: 2026-01-26 10:14:09.106 226326 DEBUG nova.network.neutron [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 10:14:09 compute-1 podman[235956]: 2026-01-26 10:14:09.31017897 +0000 UTC m=+0.090285013 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 10:14:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:09.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:14:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:09.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:14:09 compute-1 nova_compute[226322]: 2026-01-26 10:14:09.815 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.495 226326 DEBUG nova.network.neutron [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Updating instance_info_cache with network_info: [{"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.514 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Releasing lock "refresh_cache-51ec8779-f667-4f68-853c-545679d761b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.514 226326 DEBUG nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Instance network_info: |[{"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.515 226326 DEBUG oslo_concurrency.lockutils [req-eb7a636b-e5ae-49f8-bb40-ddfaa0576843 req-d58107c8-25f1-4e59-a671-4579ab4d606d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-51ec8779-f667-4f68-853c-545679d761b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.515 226326 DEBUG nova.network.neutron [req-eb7a636b-e5ae-49f8-bb40-ddfaa0576843 req-d58107c8-25f1-4e59-a671-4579ab4d606d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Refreshing network info cache for port 386a7730-6a16-4b18-b368-561762a8f7af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.517 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Start _get_guest_xml network_info=[{"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T10:05:39Z,direct_url=<?>,disk_format='qcow2',id=6789692f-fc1f-4efa-ae75-dcc13be695ef,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3ff3fa2a5531460b993c609589aa545d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T10:05:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'size': 0, 'image_id': '6789692f-fc1f-4efa-ae75-dcc13be695ef'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.521 226326 WARNING nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.526 226326 DEBUG nova.virt.libvirt.host [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.526 226326 DEBUG nova.virt.libvirt.host [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.531 226326 DEBUG nova.virt.libvirt.host [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.531 226326 DEBUG nova.virt.libvirt.host [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.532 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.532 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T10:05:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='57e1601b-dbfa-4d3b-8b96-27302e4a7a06',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T10:05:39Z,direct_url=<?>,disk_format='qcow2',id=6789692f-fc1f-4efa-ae75-dcc13be695ef,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3ff3fa2a5531460b993c609589aa545d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T10:05:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.532 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.532 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.533 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.533 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.533 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.533 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.533 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.534 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.534 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.534 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 10:14:10 compute-1 nova_compute[226322]: 2026-01-26 10:14:10.536 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:14:10 compute-1 ceph-mon[80107]: pgmap v958: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:14:10 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 26 10:14:10 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/49620544' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.013 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.035 226326 DEBUG nova.storage.rbd_utils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 51ec8779-f667-4f68-853c-545679d761b9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.039 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:14:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:14:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:11.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:14:11 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 26 10:14:11 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3883266146' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.526 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.529 226326 DEBUG nova.virt.libvirt.vif [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T10:14:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1744381875',display_name='tempest-TestNetworkBasicOps-server-1744381875',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1744381875',id=8,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAvyk/Go9GRUgKM9ccY9S+v3kz7mMcjasfiu2L0DFwS5oDB3yiDxzsS07sSdloLffH02y1mQmQVvZg5ozr00/t6RFvm10CNHliA9YweQcnIUE4iIcxPZtBU7hWU+AN6Ubw==',key_name='tempest-TestNetworkBasicOps-212494757',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-u51vza1r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T10:14:05Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=51ec8779-f667-4f68-853c-545679d761b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.529 226326 DEBUG nova.network.os_vif_util [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.530 226326 DEBUG nova.network.os_vif_util [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:e4:a1,bridge_name='br-int',has_traffic_filtering=True,id=386a7730-6a16-4b18-b368-561762a8f7af,network=Network(f91dcb4b-184c-45d6-a0e9-285bb6bc3464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap386a7730-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.531 226326 DEBUG nova.objects.instance [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 51ec8779-f667-4f68-853c-545679d761b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.544 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] End _get_guest_xml xml=<domain type="kvm">
Jan 26 10:14:11 compute-1 nova_compute[226322]:   <uuid>51ec8779-f667-4f68-853c-545679d761b9</uuid>
Jan 26 10:14:11 compute-1 nova_compute[226322]:   <name>instance-00000008</name>
Jan 26 10:14:11 compute-1 nova_compute[226322]:   <memory>131072</memory>
Jan 26 10:14:11 compute-1 nova_compute[226322]:   <vcpu>1</vcpu>
Jan 26 10:14:11 compute-1 nova_compute[226322]:   <metadata>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <nova:name>tempest-TestNetworkBasicOps-server-1744381875</nova:name>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <nova:creationTime>2026-01-26 10:14:10</nova:creationTime>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <nova:flavor name="m1.nano">
Jan 26 10:14:11 compute-1 nova_compute[226322]:         <nova:memory>128</nova:memory>
Jan 26 10:14:11 compute-1 nova_compute[226322]:         <nova:disk>1</nova:disk>
Jan 26 10:14:11 compute-1 nova_compute[226322]:         <nova:swap>0</nova:swap>
Jan 26 10:14:11 compute-1 nova_compute[226322]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 10:14:11 compute-1 nova_compute[226322]:         <nova:vcpus>1</nova:vcpus>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       </nova:flavor>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <nova:owner>
Jan 26 10:14:11 compute-1 nova_compute[226322]:         <nova:user uuid="c1208d3e25b940ea93fe76884c7a53db">tempest-TestNetworkBasicOps-966559857-project-member</nova:user>
Jan 26 10:14:11 compute-1 nova_compute[226322]:         <nova:project uuid="6ed221b375a44fc2bb2a8f232c5446e7">tempest-TestNetworkBasicOps-966559857</nova:project>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       </nova:owner>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <nova:root type="image" uuid="6789692f-fc1f-4efa-ae75-dcc13be695ef"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <nova:ports>
Jan 26 10:14:11 compute-1 nova_compute[226322]:         <nova:port uuid="386a7730-6a16-4b18-b368-561762a8f7af">
Jan 26 10:14:11 compute-1 nova_compute[226322]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:         </nova:port>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       </nova:ports>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     </nova:instance>
Jan 26 10:14:11 compute-1 nova_compute[226322]:   </metadata>
Jan 26 10:14:11 compute-1 nova_compute[226322]:   <sysinfo type="smbios">
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <system>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <entry name="manufacturer">RDO</entry>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <entry name="product">OpenStack Compute</entry>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <entry name="serial">51ec8779-f667-4f68-853c-545679d761b9</entry>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <entry name="uuid">51ec8779-f667-4f68-853c-545679d761b9</entry>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <entry name="family">Virtual Machine</entry>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     </system>
Jan 26 10:14:11 compute-1 nova_compute[226322]:   </sysinfo>
Jan 26 10:14:11 compute-1 nova_compute[226322]:   <os>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <boot dev="hd"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <smbios mode="sysinfo"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:   </os>
Jan 26 10:14:11 compute-1 nova_compute[226322]:   <features>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <acpi/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <apic/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <vmcoreinfo/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:   </features>
Jan 26 10:14:11 compute-1 nova_compute[226322]:   <clock offset="utc">
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <timer name="hpet" present="no"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:   </clock>
Jan 26 10:14:11 compute-1 nova_compute[226322]:   <cpu mode="host-model" match="exact">
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:   </cpu>
Jan 26 10:14:11 compute-1 nova_compute[226322]:   <devices>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <disk type="network" device="disk">
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <driver type="raw" cache="none"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <source protocol="rbd" name="vms/51ec8779-f667-4f68-853c-545679d761b9_disk">
Jan 26 10:14:11 compute-1 nova_compute[226322]:         <host name="192.168.122.100" port="6789"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:         <host name="192.168.122.102" port="6789"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:         <host name="192.168.122.101" port="6789"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       </source>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <auth username="openstack">
Jan 26 10:14:11 compute-1 nova_compute[226322]:         <secret type="ceph" uuid="1a70b85d-e3fd-5814-8a6a-37ea00fcae30"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       </auth>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <target dev="vda" bus="virtio"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     </disk>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <disk type="network" device="cdrom">
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <driver type="raw" cache="none"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <source protocol="rbd" name="vms/51ec8779-f667-4f68-853c-545679d761b9_disk.config">
Jan 26 10:14:11 compute-1 nova_compute[226322]:         <host name="192.168.122.100" port="6789"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:         <host name="192.168.122.102" port="6789"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:         <host name="192.168.122.101" port="6789"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       </source>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <auth username="openstack">
Jan 26 10:14:11 compute-1 nova_compute[226322]:         <secret type="ceph" uuid="1a70b85d-e3fd-5814-8a6a-37ea00fcae30"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       </auth>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <target dev="sda" bus="sata"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     </disk>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <interface type="ethernet">
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <mac address="fa:16:3e:d6:e4:a1"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <model type="virtio"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <mtu size="1442"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <target dev="tap386a7730-6a"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     </interface>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <serial type="pty">
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <log file="/var/lib/nova/instances/51ec8779-f667-4f68-853c-545679d761b9/console.log" append="off"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     </serial>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <video>
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <model type="virtio"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     </video>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <input type="tablet" bus="usb"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <rng model="virtio">
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <backend model="random">/dev/urandom</backend>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     </rng>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <controller type="usb" index="0"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     <memballoon model="virtio">
Jan 26 10:14:11 compute-1 nova_compute[226322]:       <stats period="10"/>
Jan 26 10:14:11 compute-1 nova_compute[226322]:     </memballoon>
Jan 26 10:14:11 compute-1 nova_compute[226322]:   </devices>
Jan 26 10:14:11 compute-1 nova_compute[226322]: </domain>
Jan 26 10:14:11 compute-1 nova_compute[226322]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.546 226326 DEBUG nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Preparing to wait for external event network-vif-plugged-386a7730-6a16-4b18-b368-561762a8f7af prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.546 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "51ec8779-f667-4f68-853c-545679d761b9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.546 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.547 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.547 226326 DEBUG nova.virt.libvirt.vif [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T10:14:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1744381875',display_name='tempest-TestNetworkBasicOps-server-1744381875',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1744381875',id=8,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAvyk/Go9GRUgKM9ccY9S+v3kz7mMcjasfiu2L0DFwS5oDB3yiDxzsS07sSdloLffH02y1mQmQVvZg5ozr00/t6RFvm10CNHliA9YweQcnIUE4iIcxPZtBU7hWU+AN6Ubw==',key_name='tempest-TestNetworkBasicOps-212494757',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-u51vza1r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T10:14:05Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=51ec8779-f667-4f68-853c-545679d761b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.548 226326 DEBUG nova.network.os_vif_util [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.548 226326 DEBUG nova.network.os_vif_util [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:e4:a1,bridge_name='br-int',has_traffic_filtering=True,id=386a7730-6a16-4b18-b368-561762a8f7af,network=Network(f91dcb4b-184c-45d6-a0e9-285bb6bc3464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap386a7730-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.549 226326 DEBUG os_vif [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:e4:a1,bridge_name='br-int',has_traffic_filtering=True,id=386a7730-6a16-4b18-b368-561762a8f7af,network=Network(f91dcb4b-184c-45d6-a0e9-285bb6bc3464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap386a7730-6a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.550 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.550 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.551 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.554 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.554 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap386a7730-6a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.555 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap386a7730-6a, col_values=(('external_ids', {'iface-id': '386a7730-6a16-4b18-b368-561762a8f7af', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d6:e4:a1', 'vm-uuid': '51ec8779-f667-4f68-853c-545679d761b9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:14:11 compute-1 NetworkManager[49073]: <info>  [1769422451.5593] manager: (tap386a7730-6a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.565 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.566 226326 INFO os_vif [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:e4:a1,bridge_name='br-int',has_traffic_filtering=True,id=386a7730-6a16-4b18-b368-561762a8f7af,network=Network(f91dcb4b-184c-45d6-a0e9-285bb6bc3464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap386a7730-6a')
Jan 26 10:14:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 10:14:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:11.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.608 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.609 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.609 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No VIF found with MAC fa:16:3e:d6:e4:a1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.609 226326 INFO nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Using config drive
Jan 26 10:14:11 compute-1 nova_compute[226322]: 2026-01-26 10:14:11.633 226326 DEBUG nova.storage.rbd_utils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 51ec8779-f667-4f68-853c-545679d761b9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:14:11 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:14:11 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/49620544' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:14:11 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3883266146' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:14:12 compute-1 nova_compute[226322]: 2026-01-26 10:14:12.166 226326 DEBUG nova.network.neutron [req-eb7a636b-e5ae-49f8-bb40-ddfaa0576843 req-d58107c8-25f1-4e59-a671-4579ab4d606d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Updated VIF entry in instance network info cache for port 386a7730-6a16-4b18-b368-561762a8f7af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 10:14:12 compute-1 nova_compute[226322]: 2026-01-26 10:14:12.167 226326 DEBUG nova.network.neutron [req-eb7a636b-e5ae-49f8-bb40-ddfaa0576843 req-d58107c8-25f1-4e59-a671-4579ab4d606d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Updating instance_info_cache with network_info: [{"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:14:12 compute-1 nova_compute[226322]: 2026-01-26 10:14:12.188 226326 DEBUG oslo_concurrency.lockutils [req-eb7a636b-e5ae-49f8-bb40-ddfaa0576843 req-d58107c8-25f1-4e59-a671-4579ab4d606d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-51ec8779-f667-4f68-853c-545679d761b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:14:12 compute-1 nova_compute[226322]: 2026-01-26 10:14:12.317 226326 INFO nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Creating config drive at /var/lib/nova/instances/51ec8779-f667-4f68-853c-545679d761b9/disk.config
Jan 26 10:14:12 compute-1 nova_compute[226322]: 2026-01-26 10:14:12.321 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/51ec8779-f667-4f68-853c-545679d761b9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg7ylmt3m execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:14:12 compute-1 nova_compute[226322]: 2026-01-26 10:14:12.446 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/51ec8779-f667-4f68-853c-545679d761b9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg7ylmt3m" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:14:12 compute-1 nova_compute[226322]: 2026-01-26 10:14:12.478 226326 DEBUG nova.storage.rbd_utils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 51ec8779-f667-4f68-853c-545679d761b9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:14:12 compute-1 nova_compute[226322]: 2026-01-26 10:14:12.482 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/51ec8779-f667-4f68-853c-545679d761b9/disk.config 51ec8779-f667-4f68-853c-545679d761b9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:14:12 compute-1 nova_compute[226322]: 2026-01-26 10:14:12.664 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/51ec8779-f667-4f68-853c-545679d761b9/disk.config 51ec8779-f667-4f68-853c-545679d761b9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.181s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:14:12 compute-1 nova_compute[226322]: 2026-01-26 10:14:12.665 226326 INFO nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Deleting local config drive /var/lib/nova/instances/51ec8779-f667-4f68-853c-545679d761b9/disk.config because it was imported into RBD.
Jan 26 10:14:12 compute-1 kernel: tap386a7730-6a: entered promiscuous mode
Jan 26 10:14:12 compute-1 NetworkManager[49073]: <info>  [1769422452.7146] manager: (tap386a7730-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Jan 26 10:14:12 compute-1 ovn_controller[133670]: 2026-01-26T10:14:12Z|00056|binding|INFO|Claiming lport 386a7730-6a16-4b18-b368-561762a8f7af for this chassis.
Jan 26 10:14:12 compute-1 ovn_controller[133670]: 2026-01-26T10:14:12Z|00057|binding|INFO|386a7730-6a16-4b18-b368-561762a8f7af: Claiming fa:16:3e:d6:e4:a1 10.100.0.10
Jan 26 10:14:12 compute-1 nova_compute[226322]: 2026-01-26 10:14:12.715 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:12 compute-1 nova_compute[226322]: 2026-01-26 10:14:12.718 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:12 compute-1 nova_compute[226322]: 2026-01-26 10:14:12.720 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.727 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:e4:a1 10.100.0.10'], port_security=['fa:16:3e:d6:e4:a1 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1712540863', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '51ec8779-f667-4f68-853c-545679d761b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f91dcb4b-184c-45d6-a0e9-285bb6bc3464', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1712540863', 'neutron:project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '75a6a4cb-bd58-457c-b449-9db5f70f3f78', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=01b44e6c-3a91-48f0-92f1-3334bccbc3c9, chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], logical_port=386a7730-6a16-4b18-b368-561762a8f7af) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:14:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.729 143326 INFO neutron.agent.ovn.metadata.agent [-] Port 386a7730-6a16-4b18-b368-561762a8f7af in datapath f91dcb4b-184c-45d6-a0e9-285bb6bc3464 bound to our chassis
Jan 26 10:14:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.730 143326 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f91dcb4b-184c-45d6-a0e9-285bb6bc3464
Jan 26 10:14:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.740 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[986479cd-6a1a-4728-9e52-42b449f27485]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.740 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf91dcb4b-11 in ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 10:14:12 compute-1 systemd-machined[194876]: New machine qemu-4-instance-00000008.
Jan 26 10:14:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.742 229912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf91dcb4b-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 10:14:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.742 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[5dd5c6d3-07a3-45ca-83f5-49c9d0ec9696]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.742 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[a1cb344e-3d45-4fdc-afc0-8020c510fd94]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:12 compute-1 systemd-udevd[236120]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 10:14:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.753 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[6cd6b589-adc0-42f9-93ab-9a44e448ee4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:12 compute-1 NetworkManager[49073]: <info>  [1769422452.7584] device (tap386a7730-6a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 10:14:12 compute-1 NetworkManager[49073]: <info>  [1769422452.7591] device (tap386a7730-6a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 10:14:12 compute-1 systemd[1]: Started Virtual Machine qemu-4-instance-00000008.
Jan 26 10:14:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.775 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[98d5f69a-18b1-444c-a415-c742b6064c96]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:12 compute-1 nova_compute[226322]: 2026-01-26 10:14:12.779 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:12 compute-1 ovn_controller[133670]: 2026-01-26T10:14:12Z|00058|binding|INFO|Setting lport 386a7730-6a16-4b18-b368-561762a8f7af ovn-installed in OVS
Jan 26 10:14:12 compute-1 ovn_controller[133670]: 2026-01-26T10:14:12Z|00059|binding|INFO|Setting lport 386a7730-6a16-4b18-b368-561762a8f7af up in Southbound
Jan 26 10:14:12 compute-1 nova_compute[226322]: 2026-01-26 10:14:12.784 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.803 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[b922dd51-b7ae-41be-a174-7deb07e1ce0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:12 compute-1 NetworkManager[49073]: <info>  [1769422452.8094] manager: (tapf91dcb4b-10): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Jan 26 10:14:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.809 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[4e65f143-6056-4a54-b3ed-5a2e8bf1f4a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.837 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[2bc33c62-799e-4570-af6c-06549661905c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.839 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[6c1677a7-01de-4581-8ab3-062d57d13b2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:12 compute-1 NetworkManager[49073]: <info>  [1769422452.8595] device (tapf91dcb4b-10): carrier: link connected
Jan 26 10:14:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.864 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[9658aa09-091c-4e67-9e48-6be98f1744ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.883 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[1dfa9bcf-e1d0-4bce-85f5-91104a66f2d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf91dcb4b-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:0e:fb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441342, 'reachable_time': 28311, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236152, 'error': None, 'target': 'ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.897 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[c416f8bc-f822-4464-84f6-7fb6b5df901a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe80:efb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441342, 'tstamp': 441342}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236153, 'error': None, 'target': 'ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.916 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[7db23196-494d-4b7d-8984-3172b3f1f746]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf91dcb4b-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:0e:fb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441342, 'reachable_time': 28311, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236154, 'error': None, 'target': 'ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.946 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[0edc154b-6d8b-429e-98a2-6fdc47cd1915]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:12 compute-1 ceph-mon[80107]: pgmap v959: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Jan 26 10:14:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:14:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:14:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:14:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:13.002 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[dfa341f8-029f-4ed9-b6ea-68abbd8991a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:13.003 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf91dcb4b-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:13.003 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:13.004 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf91dcb4b-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.005 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:13 compute-1 kernel: tapf91dcb4b-10: entered promiscuous mode
Jan 26 10:14:13 compute-1 NetworkManager[49073]: <info>  [1769422453.0062] manager: (tapf91dcb4b-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.007 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:13.008 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf91dcb4b-10, col_values=(('external_ids', {'iface-id': '242dde27-5aff-4cac-b664-221ab4bfb94f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.009 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:13 compute-1 ovn_controller[133670]: 2026-01-26T10:14:13Z|00060|binding|INFO|Releasing lport 242dde27-5aff-4cac-b664-221ab4bfb94f from this chassis (sb_readonly=0)
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.010 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:13.010 143326 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f91dcb4b-184c-45d6-a0e9-285bb6bc3464.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f91dcb4b-184c-45d6-a0e9-285bb6bc3464.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:13.011 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[fd31ca85-575d-46ca-9904-de978ac4d435]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:13.012 143326 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]: global
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     log         /dev/log local0 debug
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     log-tag     haproxy-metadata-proxy-f91dcb4b-184c-45d6-a0e9-285bb6bc3464
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     user        root
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     group       root
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     maxconn     1024
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     pidfile     /var/lib/neutron/external/pids/f91dcb4b-184c-45d6-a0e9-285bb6bc3464.pid.haproxy
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     daemon
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]: 
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]: defaults
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     log global
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     mode http
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     option httplog
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     option dontlognull
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     option http-server-close
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     option forwardfor
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     retries                 3
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     timeout http-request    30s
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     timeout connect         30s
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     timeout client          32s
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     timeout server          32s
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     timeout http-keep-alive 30s
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]: 
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]: 
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]: listen listener
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     bind 169.254.169.254:80
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:     http-request add-header X-OVN-Network-ID f91dcb4b-184c-45d6-a0e9-285bb6bc3464
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 10:14:13 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:13.013 143326 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464', 'env', 'PROCESS_TAG=haproxy-f91dcb4b-184c-45d6-a0e9-285bb6bc3464', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f91dcb4b-184c-45d6-a0e9-285bb6bc3464.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.022 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.257 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422453.257486, 51ec8779-f667-4f68-853c-545679d761b9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.258 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] VM Started (Lifecycle Event)
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.276 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.280 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422453.2576222, 51ec8779-f667-4f68-853c-545679d761b9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.280 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] VM Paused (Lifecycle Event)
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.297 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.300 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.307 226326 DEBUG nova.compute.manager [req-417921f6-960d-4084-8629-e02aa620a4c2 req-f39bde49-ef2c-4f96-b020-f4da5939b1fc b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Received event network-vif-plugged-386a7730-6a16-4b18-b368-561762a8f7af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.307 226326 DEBUG oslo_concurrency.lockutils [req-417921f6-960d-4084-8629-e02aa620a4c2 req-f39bde49-ef2c-4f96-b020-f4da5939b1fc b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "51ec8779-f667-4f68-853c-545679d761b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.307 226326 DEBUG oslo_concurrency.lockutils [req-417921f6-960d-4084-8629-e02aa620a4c2 req-f39bde49-ef2c-4f96-b020-f4da5939b1fc b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.307 226326 DEBUG oslo_concurrency.lockutils [req-417921f6-960d-4084-8629-e02aa620a4c2 req-f39bde49-ef2c-4f96-b020-f4da5939b1fc b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.308 226326 DEBUG nova.compute.manager [req-417921f6-960d-4084-8629-e02aa620a4c2 req-f39bde49-ef2c-4f96-b020-f4da5939b1fc b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Processing event network-vif-plugged-386a7730-6a16-4b18-b368-561762a8f7af _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.308 226326 DEBUG nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.312 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.314 226326 INFO nova.virt.libvirt.driver [-] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Instance spawned successfully.
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.315 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.327 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.328 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422453.310944, 51ec8779-f667-4f68-853c-545679d761b9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.328 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] VM Resumed (Lifecycle Event)
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.347 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.351 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.351 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.352 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.352 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.353 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.353 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.356 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.414 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 10:14:13 compute-1 podman[236226]: 2026-01-26 10:14:13.428519349 +0000 UTC m=+0.110285320 container create 2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.433 226326 INFO nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Took 8.02 seconds to spawn the instance on the hypervisor.
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.433 226326 DEBUG nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:14:13 compute-1 podman[236226]: 2026-01-26 10:14:13.343381089 +0000 UTC m=+0.025147090 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 26 10:14:13 compute-1 systemd[1]: Started libpod-conmon-2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020.scope.
Jan 26 10:14:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:13.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:13 compute-1 systemd[1]: Started libcrun container.
Jan 26 10:14:13 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae0fb92a0feb3d3a75cace414af4b8a6d6b232b5f42090c39d7bdc2f5ca3bf09/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 10:14:13 compute-1 podman[236226]: 2026-01-26 10:14:13.515442211 +0000 UTC m=+0.197208202 container init 2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 26 10:14:13 compute-1 podman[236226]: 2026-01-26 10:14:13.520944425 +0000 UTC m=+0.202710396 container start 2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.529 226326 INFO nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Took 9.02 seconds to build instance.
Jan 26 10:14:13 compute-1 neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464[236241]: [NOTICE]   (236245) : New worker (236247) forked
Jan 26 10:14:13 compute-1 neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464[236241]: [NOTICE]   (236245) : Loading success.
Jan 26 10:14:13 compute-1 nova_compute[226322]: 2026-01-26 10:14:13.551 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:14:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 10:14:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:13.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 10:14:14 compute-1 ceph-mon[80107]: pgmap v960: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.162445) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422454162469, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 1162, "num_deletes": 501, "total_data_size": 1931285, "memory_usage": 1966512, "flush_reason": "Manual Compaction"}
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422454175496, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 1267861, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30075, "largest_seqno": 31232, "table_properties": {"data_size": 1263060, "index_size": 1877, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 14321, "raw_average_key_size": 19, "raw_value_size": 1251336, "raw_average_value_size": 1704, "num_data_blocks": 82, "num_entries": 734, "num_filter_entries": 734, "num_deletions": 501, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769422388, "oldest_key_time": 1769422388, "file_creation_time": 1769422454, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 13088 microseconds, and 4614 cpu microseconds.
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.175529) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 1267861 bytes OK
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.175546) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.176993) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.177005) EVENT_LOG_v1 {"time_micros": 1769422454177001, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.177019) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 1924654, prev total WAL file size 1924654, number of live WAL files 2.
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.177772) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(1238KB)], [57(16MB)]
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422454177963, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 18552709, "oldest_snapshot_seqno": -1}
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5762 keys, 12337760 bytes, temperature: kUnknown
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422454298168, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 12337760, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12301098, "index_size": 21128, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14469, "raw_key_size": 148993, "raw_average_key_size": 25, "raw_value_size": 12198773, "raw_average_value_size": 2117, "num_data_blocks": 846, "num_entries": 5762, "num_filter_entries": 5762, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769422454, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.298389) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 12337760 bytes
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.301948) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 154.3 rd, 102.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 16.5 +0.0 blob) out(11.8 +0.0 blob), read-write-amplify(24.4) write-amplify(9.7) OK, records in: 6779, records dropped: 1017 output_compression: NoCompression
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.301969) EVENT_LOG_v1 {"time_micros": 1769422454301959, "job": 34, "event": "compaction_finished", "compaction_time_micros": 120242, "compaction_time_cpu_micros": 26053, "output_level": 6, "num_output_files": 1, "total_output_size": 12337760, "num_input_records": 6779, "num_output_records": 5762, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422454302306, "job": 34, "event": "table_file_deletion", "file_number": 59}
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422454305297, "job": 34, "event": "table_file_deletion", "file_number": 57}
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.177516) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.305346) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.305351) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.305352) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.305354) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:14:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.305355) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:14:14 compute-1 nova_compute[226322]: 2026-01-26 10:14:14.820 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:15 compute-1 sshd-session[236257]: Invalid user admin from 178.62.249.31 port 45700
Jan 26 10:14:15 compute-1 sshd-session[236257]: Connection closed by invalid user admin 178.62.249.31 port 45700 [preauth]
Jan 26 10:14:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000059s ======
Jan 26 10:14:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:15.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000059s
Jan 26 10:14:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:15.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:15 compute-1 nova_compute[226322]: 2026-01-26 10:14:15.603 226326 DEBUG nova.compute.manager [req-1907636a-56a8-4089-ad10-4c11149d8eea req-c4d4f985-c43f-445a-bfb9-56cc5e228e5a b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Received event network-vif-plugged-386a7730-6a16-4b18-b368-561762a8f7af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:14:15 compute-1 nova_compute[226322]: 2026-01-26 10:14:15.604 226326 DEBUG oslo_concurrency.lockutils [req-1907636a-56a8-4089-ad10-4c11149d8eea req-c4d4f985-c43f-445a-bfb9-56cc5e228e5a b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "51ec8779-f667-4f68-853c-545679d761b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:14:15 compute-1 nova_compute[226322]: 2026-01-26 10:14:15.606 226326 DEBUG oslo_concurrency.lockutils [req-1907636a-56a8-4089-ad10-4c11149d8eea req-c4d4f985-c43f-445a-bfb9-56cc5e228e5a b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:14:15 compute-1 nova_compute[226322]: 2026-01-26 10:14:15.606 226326 DEBUG oslo_concurrency.lockutils [req-1907636a-56a8-4089-ad10-4c11149d8eea req-c4d4f985-c43f-445a-bfb9-56cc5e228e5a b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:14:15 compute-1 nova_compute[226322]: 2026-01-26 10:14:15.607 226326 DEBUG nova.compute.manager [req-1907636a-56a8-4089-ad10-4c11149d8eea req-c4d4f985-c43f-445a-bfb9-56cc5e228e5a b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] No waiting events found dispatching network-vif-plugged-386a7730-6a16-4b18-b368-561762a8f7af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 10:14:15 compute-1 nova_compute[226322]: 2026-01-26 10:14:15.607 226326 WARNING nova.compute.manager [req-1907636a-56a8-4089-ad10-4c11149d8eea req-c4d4f985-c43f-445a-bfb9-56cc5e228e5a b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Received unexpected event network-vif-plugged-386a7730-6a16-4b18-b368-561762a8f7af for instance with vm_state active and task_state None.
Jan 26 10:14:16 compute-1 NetworkManager[49073]: <info>  [1769422456.3461] manager: (patch-br-int-to-provnet-94d9950f-5cf2-4813-9455-dd14377245f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 26 10:14:16 compute-1 NetworkManager[49073]: <info>  [1769422456.3470] manager: (patch-provnet-94d9950f-5cf2-4813-9455-dd14377245f4-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 26 10:14:16 compute-1 nova_compute[226322]: 2026-01-26 10:14:16.346 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:16 compute-1 ovn_controller[133670]: 2026-01-26T10:14:16Z|00061|binding|INFO|Releasing lport 242dde27-5aff-4cac-b664-221ab4bfb94f from this chassis (sb_readonly=0)
Jan 26 10:14:16 compute-1 ovn_controller[133670]: 2026-01-26T10:14:16Z|00062|binding|INFO|Releasing lport 242dde27-5aff-4cac-b664-221ab4bfb94f from this chassis (sb_readonly=0)
Jan 26 10:14:16 compute-1 nova_compute[226322]: 2026-01-26 10:14:16.395 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:16 compute-1 nova_compute[226322]: 2026-01-26 10:14:16.398 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:16 compute-1 nova_compute[226322]: 2026-01-26 10:14:16.557 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:16 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:14:16 compute-1 ceph-mon[80107]: pgmap v961: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 102 op/s
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.074 226326 DEBUG oslo_concurrency.lockutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "51ec8779-f667-4f68-853c-545679d761b9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.075 226326 DEBUG oslo_concurrency.lockutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.075 226326 DEBUG oslo_concurrency.lockutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "51ec8779-f667-4f68-853c-545679d761b9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.075 226326 DEBUG oslo_concurrency.lockutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.076 226326 DEBUG oslo_concurrency.lockutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.077 226326 INFO nova.compute.manager [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Terminating instance
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.078 226326 DEBUG nova.compute.manager [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 10:14:17 compute-1 kernel: tap386a7730-6a (unregistering): left promiscuous mode
Jan 26 10:14:17 compute-1 NetworkManager[49073]: <info>  [1769422457.1976] device (tap386a7730-6a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 10:14:17 compute-1 ovn_controller[133670]: 2026-01-26T10:14:17Z|00063|binding|INFO|Releasing lport 386a7730-6a16-4b18-b368-561762a8f7af from this chassis (sb_readonly=0)
Jan 26 10:14:17 compute-1 ovn_controller[133670]: 2026-01-26T10:14:17Z|00064|binding|INFO|Setting lport 386a7730-6a16-4b18-b368-561762a8f7af down in Southbound
Jan 26 10:14:17 compute-1 ovn_controller[133670]: 2026-01-26T10:14:17Z|00065|binding|INFO|Removing iface tap386a7730-6a ovn-installed in OVS
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.229 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.233 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:17 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.238 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:e4:a1 10.100.0.10'], port_security=['fa:16:3e:d6:e4:a1 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1712540863', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '51ec8779-f667-4f68-853c-545679d761b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f91dcb4b-184c-45d6-a0e9-285bb6bc3464', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1712540863', 'neutron:project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '75a6a4cb-bd58-457c-b449-9db5f70f3f78', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.178'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=01b44e6c-3a91-48f0-92f1-3334bccbc3c9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], logical_port=386a7730-6a16-4b18-b368-561762a8f7af) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:14:17 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.239 143326 INFO neutron.agent.ovn.metadata.agent [-] Port 386a7730-6a16-4b18-b368-561762a8f7af in datapath f91dcb4b-184c-45d6-a0e9-285bb6bc3464 unbound from our chassis
Jan 26 10:14:17 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.240 143326 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f91dcb4b-184c-45d6-a0e9-285bb6bc3464, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 10:14:17 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.241 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[eba20e8b-44e9-46a1-8dcf-e5be974520a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:17 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.241 143326 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464 namespace which is not needed anymore
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.251 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:17 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000008.scope: Deactivated successfully.
Jan 26 10:14:17 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000008.scope: Consumed 4.270s CPU time.
Jan 26 10:14:17 compute-1 systemd-machined[194876]: Machine qemu-4-instance-00000008 terminated.
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.319 226326 INFO nova.virt.libvirt.driver [-] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Instance destroyed successfully.
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.320 226326 DEBUG nova.objects.instance [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'resources' on Instance uuid 51ec8779-f667-4f68-853c-545679d761b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.334 226326 DEBUG nova.virt.libvirt.vif [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T10:14:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1744381875',display_name='tempest-TestNetworkBasicOps-server-1744381875',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1744381875',id=8,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAvyk/Go9GRUgKM9ccY9S+v3kz7mMcjasfiu2L0DFwS5oDB3yiDxzsS07sSdloLffH02y1mQmQVvZg5ozr00/t6RFvm10CNHliA9YweQcnIUE4iIcxPZtBU7hWU+AN6Ubw==',key_name='tempest-TestNetworkBasicOps-212494757',keypairs=<?>,launch_index=0,launched_at=2026-01-26T10:14:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-u51vza1r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T10:14:13Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=51ec8779-f667-4f68-853c-545679d761b9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.334 226326 DEBUG nova.network.os_vif_util [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.335 226326 DEBUG nova.network.os_vif_util [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:e4:a1,bridge_name='br-int',has_traffic_filtering=True,id=386a7730-6a16-4b18-b368-561762a8f7af,network=Network(f91dcb4b-184c-45d6-a0e9-285bb6bc3464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap386a7730-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.335 226326 DEBUG os_vif [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:e4:a1,bridge_name='br-int',has_traffic_filtering=True,id=386a7730-6a16-4b18-b368-561762a8f7af,network=Network(f91dcb4b-184c-45d6-a0e9-285bb6bc3464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap386a7730-6a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.336 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.336 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap386a7730-6a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.338 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.339 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.344 226326 INFO os_vif [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:e4:a1,bridge_name='br-int',has_traffic_filtering=True,id=386a7730-6a16-4b18-b368-561762a8f7af,network=Network(f91dcb4b-184c-45d6-a0e9-285bb6bc3464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap386a7730-6a')
Jan 26 10:14:17 compute-1 neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464[236241]: [NOTICE]   (236245) : haproxy version is 2.8.14-c23fe91
Jan 26 10:14:17 compute-1 neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464[236241]: [NOTICE]   (236245) : path to executable is /usr/sbin/haproxy
Jan 26 10:14:17 compute-1 neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464[236241]: [WARNING]  (236245) : Exiting Master process...
Jan 26 10:14:17 compute-1 neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464[236241]: [ALERT]    (236245) : Current worker (236247) exited with code 143 (Terminated)
Jan 26 10:14:17 compute-1 neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464[236241]: [WARNING]  (236245) : All workers exited. Exiting... (0)
Jan 26 10:14:17 compute-1 systemd[1]: libpod-2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020.scope: Deactivated successfully.
Jan 26 10:14:17 compute-1 podman[236296]: 2026-01-26 10:14:17.392377362 +0000 UTC m=+0.050704693 container died 2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 26 10:14:17 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020-userdata-shm.mount: Deactivated successfully.
Jan 26 10:14:17 compute-1 systemd[1]: var-lib-containers-storage-overlay-ae0fb92a0feb3d3a75cace414af4b8a6d6b232b5f42090c39d7bdc2f5ca3bf09-merged.mount: Deactivated successfully.
Jan 26 10:14:17 compute-1 podman[236296]: 2026-01-26 10:14:17.42919543 +0000 UTC m=+0.087522761 container cleanup 2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 26 10:14:17 compute-1 systemd[1]: libpod-conmon-2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020.scope: Deactivated successfully.
Jan 26 10:14:17 compute-1 podman[236343]: 2026-01-26 10:14:17.492616981 +0000 UTC m=+0.045620132 container remove 2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 10:14:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:17.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:17 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.498 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[36214e9f-d97f-4b94-a661-e539f4bf6900]: (4, ('Mon Jan 26 10:14:17 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464 (2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020)\n2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020\nMon Jan 26 10:14:17 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464 (2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020)\n2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:17 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.500 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[de535640-afd2-4dbd-8092-86da3e2e813f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:17 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.501 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf91dcb4b-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.502 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:17 compute-1 kernel: tapf91dcb4b-10: left promiscuous mode
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.515 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:17 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.521 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[4125d8f9-fa97-445f-b0ea-4022e6b6f3d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:17 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.541 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[b39eab59-1a73-4e57-8a10-975f5cbed1f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:17 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.542 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[81046efe-1171-40de-8342-ae21e9cf01c1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:17 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.562 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[8bf46682-64b8-4fdd-9955-029e01e3e7e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441336, 'reachable_time': 43919, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236359, 'error': None, 'target': 'ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:17 compute-1 systemd[1]: run-netns-ovnmeta\x2df91dcb4b\x2d184c\x2d45d6\x2da0e9\x2d285bb6bc3464.mount: Deactivated successfully.
Jan 26 10:14:17 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.565 143615 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 10:14:17 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.565 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[52bc1476-2007-4c26-bb54-439faa430d50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:14:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:17.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.665 226326 DEBUG nova.compute.manager [req-ae1e18f9-a5ef-457c-89f6-ca94869baf3a req-89372e82-9d81-444f-a8cc-a59a98645539 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Received event network-changed-386a7730-6a16-4b18-b368-561762a8f7af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.665 226326 DEBUG nova.compute.manager [req-ae1e18f9-a5ef-457c-89f6-ca94869baf3a req-89372e82-9d81-444f-a8cc-a59a98645539 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Refreshing instance network info cache due to event network-changed-386a7730-6a16-4b18-b368-561762a8f7af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.666 226326 DEBUG oslo_concurrency.lockutils [req-ae1e18f9-a5ef-457c-89f6-ca94869baf3a req-89372e82-9d81-444f-a8cc-a59a98645539 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-51ec8779-f667-4f68-853c-545679d761b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.666 226326 DEBUG oslo_concurrency.lockutils [req-ae1e18f9-a5ef-457c-89f6-ca94869baf3a req-89372e82-9d81-444f-a8cc-a59a98645539 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-51ec8779-f667-4f68-853c-545679d761b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.666 226326 DEBUG nova.network.neutron [req-ae1e18f9-a5ef-457c-89f6-ca94869baf3a req-89372e82-9d81-444f-a8cc-a59a98645539 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Refreshing network info cache for port 386a7730-6a16-4b18-b368-561762a8f7af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.741 226326 INFO nova.virt.libvirt.driver [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Deleting instance files /var/lib/nova/instances/51ec8779-f667-4f68-853c-545679d761b9_del
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.742 226326 INFO nova.virt.libvirt.driver [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Deletion of /var/lib/nova/instances/51ec8779-f667-4f68-853c-545679d761b9_del complete
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.813 226326 INFO nova.compute.manager [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Took 0.73 seconds to destroy the instance on the hypervisor.
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.814 226326 DEBUG oslo.service.loopingcall [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.814 226326 DEBUG nova.compute.manager [-] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 10:14:17 compute-1 nova_compute[226322]: 2026-01-26 10:14:17.814 226326 DEBUG nova.network.neutron [-] [instance: 51ec8779-f667-4f68-853c-545679d761b9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 10:14:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:14:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:14:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:14:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:14:18 compute-1 ceph-mon[80107]: pgmap v962: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Jan 26 10:14:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:14:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 10:14:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:19.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 10:14:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 10:14:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:19.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 10:14:19 compute-1 nova_compute[226322]: 2026-01-26 10:14:19.820 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:19 compute-1 nova_compute[226322]: 2026-01-26 10:14:19.902 226326 DEBUG nova.compute.manager [req-04875165-a0ef-4565-a618-107e586658b7 req-8bda776c-de0b-4179-9c23-dda2d541e618 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Received event network-vif-unplugged-386a7730-6a16-4b18-b368-561762a8f7af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:14:19 compute-1 nova_compute[226322]: 2026-01-26 10:14:19.903 226326 DEBUG oslo_concurrency.lockutils [req-04875165-a0ef-4565-a618-107e586658b7 req-8bda776c-de0b-4179-9c23-dda2d541e618 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "51ec8779-f667-4f68-853c-545679d761b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:14:19 compute-1 nova_compute[226322]: 2026-01-26 10:14:19.903 226326 DEBUG oslo_concurrency.lockutils [req-04875165-a0ef-4565-a618-107e586658b7 req-8bda776c-de0b-4179-9c23-dda2d541e618 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:14:19 compute-1 nova_compute[226322]: 2026-01-26 10:14:19.904 226326 DEBUG oslo_concurrency.lockutils [req-04875165-a0ef-4565-a618-107e586658b7 req-8bda776c-de0b-4179-9c23-dda2d541e618 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:14:19 compute-1 nova_compute[226322]: 2026-01-26 10:14:19.904 226326 DEBUG nova.compute.manager [req-04875165-a0ef-4565-a618-107e586658b7 req-8bda776c-de0b-4179-9c23-dda2d541e618 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] No waiting events found dispatching network-vif-unplugged-386a7730-6a16-4b18-b368-561762a8f7af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 10:14:19 compute-1 nova_compute[226322]: 2026-01-26 10:14:19.904 226326 DEBUG nova.compute.manager [req-04875165-a0ef-4565-a618-107e586658b7 req-8bda776c-de0b-4179-9c23-dda2d541e618 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Received event network-vif-unplugged-386a7730-6a16-4b18-b368-561762a8f7af for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 10:14:20 compute-1 ceph-mon[80107]: pgmap v963: 353 pgs: 353 active+clean; 41 MiB data, 261 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 1.8 MiB/s wr, 128 op/s
Jan 26 10:14:20 compute-1 sudo[236362]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:14:20 compute-1 sudo[236362]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:14:20 compute-1 sudo[236362]: pam_unix(sudo:session): session closed for user root
Jan 26 10:14:20 compute-1 sudo[236387]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:14:20 compute-1 sudo[236387]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:14:20 compute-1 sudo[236387]: pam_unix(sudo:session): session closed for user root
Jan 26 10:14:21 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:14:21 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:14:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:21.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:21.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:21 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:14:21 compute-1 nova_compute[226322]: 2026-01-26 10:14:21.758 226326 DEBUG nova.network.neutron [req-ae1e18f9-a5ef-457c-89f6-ca94869baf3a req-89372e82-9d81-444f-a8cc-a59a98645539 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Updated VIF entry in instance network info cache for port 386a7730-6a16-4b18-b368-561762a8f7af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 10:14:21 compute-1 nova_compute[226322]: 2026-01-26 10:14:21.759 226326 DEBUG nova.network.neutron [req-ae1e18f9-a5ef-457c-89f6-ca94869baf3a req-89372e82-9d81-444f-a8cc-a59a98645539 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Updating instance_info_cache with network_info: [{"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:14:21 compute-1 nova_compute[226322]: 2026-01-26 10:14:21.777 226326 DEBUG oslo_concurrency.lockutils [req-ae1e18f9-a5ef-457c-89f6-ca94869baf3a req-89372e82-9d81-444f-a8cc-a59a98645539 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-51ec8779-f667-4f68-853c-545679d761b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:14:21 compute-1 nova_compute[226322]: 2026-01-26 10:14:21.921 226326 DEBUG nova.network.neutron [-] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:14:21 compute-1 nova_compute[226322]: 2026-01-26 10:14:21.939 226326 INFO nova.compute.manager [-] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Took 4.12 seconds to deallocate network for instance.
Jan 26 10:14:21 compute-1 nova_compute[226322]: 2026-01-26 10:14:21.994 226326 DEBUG nova.compute.manager [req-e46b7dd0-ac26-4a95-b391-a57697225b73 req-14264520-c892-48b6-b7e9-8a886cf29dd8 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Received event network-vif-plugged-386a7730-6a16-4b18-b368-561762a8f7af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:14:21 compute-1 nova_compute[226322]: 2026-01-26 10:14:21.994 226326 DEBUG oslo_concurrency.lockutils [req-e46b7dd0-ac26-4a95-b391-a57697225b73 req-14264520-c892-48b6-b7e9-8a886cf29dd8 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "51ec8779-f667-4f68-853c-545679d761b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:14:21 compute-1 nova_compute[226322]: 2026-01-26 10:14:21.995 226326 DEBUG oslo_concurrency.lockutils [req-e46b7dd0-ac26-4a95-b391-a57697225b73 req-14264520-c892-48b6-b7e9-8a886cf29dd8 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:14:21 compute-1 nova_compute[226322]: 2026-01-26 10:14:21.995 226326 DEBUG oslo_concurrency.lockutils [req-e46b7dd0-ac26-4a95-b391-a57697225b73 req-14264520-c892-48b6-b7e9-8a886cf29dd8 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:14:21 compute-1 nova_compute[226322]: 2026-01-26 10:14:21.996 226326 DEBUG nova.compute.manager [req-e46b7dd0-ac26-4a95-b391-a57697225b73 req-14264520-c892-48b6-b7e9-8a886cf29dd8 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] No waiting events found dispatching network-vif-plugged-386a7730-6a16-4b18-b368-561762a8f7af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 10:14:21 compute-1 nova_compute[226322]: 2026-01-26 10:14:21.996 226326 WARNING nova.compute.manager [req-e46b7dd0-ac26-4a95-b391-a57697225b73 req-14264520-c892-48b6-b7e9-8a886cf29dd8 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Received unexpected event network-vif-plugged-386a7730-6a16-4b18-b368-561762a8f7af for instance with vm_state active and task_state deleting.
Jan 26 10:14:21 compute-1 nova_compute[226322]: 2026-01-26 10:14:21.999 226326 DEBUG oslo_concurrency.lockutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:14:22 compute-1 nova_compute[226322]: 2026-01-26 10:14:21.999 226326 DEBUG oslo_concurrency.lockutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:14:22 compute-1 nova_compute[226322]: 2026-01-26 10:14:22.053 226326 DEBUG oslo_concurrency.processutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:14:22 compute-1 nova_compute[226322]: 2026-01-26 10:14:22.338 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:22 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:14:22 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:14:22 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:14:22 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:14:22 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:14:22 compute-1 ceph-mon[80107]: pgmap v964: 353 pgs: 353 active+clean; 41 MiB data, 261 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 1.8 MiB/s wr, 128 op/s
Jan 26 10:14:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:14:22 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3329353095' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:14:22 compute-1 nova_compute[226322]: 2026-01-26 10:14:22.521 226326 DEBUG oslo_concurrency.processutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:14:22 compute-1 nova_compute[226322]: 2026-01-26 10:14:22.528 226326 DEBUG nova.compute.provider_tree [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:14:22 compute-1 nova_compute[226322]: 2026-01-26 10:14:22.624 226326 DEBUG nova.scheduler.client.report [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:14:22 compute-1 nova_compute[226322]: 2026-01-26 10:14:22.645 226326 DEBUG oslo_concurrency.lockutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:14:22 compute-1 nova_compute[226322]: 2026-01-26 10:14:22.679 226326 INFO nova.scheduler.client.report [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Deleted allocations for instance 51ec8779-f667-4f68-853c-545679d761b9
Jan 26 10:14:22 compute-1 nova_compute[226322]: 2026-01-26 10:14:22.747 226326 DEBUG oslo_concurrency.lockutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.672s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:14:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:14:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:14:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:14:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:14:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:23.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 10:14:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:23.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 10:14:23 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3329353095' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:14:24 compute-1 podman[236468]: 2026-01-26 10:14:24.281927039 +0000 UTC m=+0.055282600 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 26 10:14:24 compute-1 sudo[236488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:14:24 compute-1 sudo[236488]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:14:24 compute-1 sudo[236488]: pam_unix(sudo:session): session closed for user root
Jan 26 10:14:24 compute-1 ceph-mon[80107]: pgmap v965: 353 pgs: 353 active+clean; 41 MiB data, 261 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 101 op/s
Jan 26 10:14:24 compute-1 nova_compute[226322]: 2026-01-26 10:14:24.783 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:24 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:24.783 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:14:24 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:24.785 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 10:14:24 compute-1 nova_compute[226322]: 2026-01-26 10:14:24.821 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:25.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:25.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:26 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:14:27 compute-1 ceph-mon[80107]: pgmap v966: 353 pgs: 353 active+clean; 41 MiB data, 261 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 101 op/s
Jan 26 10:14:27 compute-1 nova_compute[226322]: 2026-01-26 10:14:27.340 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:27.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:27.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:27 compute-1 sudo[236515]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:14:27 compute-1 sudo[236515]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:14:27 compute-1 sudo[236515]: pam_unix(sudo:session): session closed for user root
Jan 26 10:14:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:14:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:14:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:14:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:14:28 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:14:28 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:14:28 compute-1 ceph-mon[80107]: pgmap v967: 353 pgs: 353 active+clean; 41 MiB data, 261 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Jan 26 10:14:28 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:28.789 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:14:29 compute-1 sshd-session[236540]: Invalid user admin from 152.42.136.110 port 54482
Jan 26 10:14:29 compute-1 sshd-session[236540]: Connection closed by invalid user admin 152.42.136.110 port 54482 [preauth]
Jan 26 10:14:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:29.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:14:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:29.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:14:29 compute-1 nova_compute[226322]: 2026-01-26 10:14:29.824 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:30 compute-1 ceph-mon[80107]: pgmap v968: 353 pgs: 353 active+clean; 41 MiB data, 261 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Jan 26 10:14:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:31.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:31.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:31 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:14:32 compute-1 nova_compute[226322]: 2026-01-26 10:14:32.318 226326 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769422457.316042, 51ec8779-f667-4f68-853c-545679d761b9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 10:14:32 compute-1 nova_compute[226322]: 2026-01-26 10:14:32.318 226326 INFO nova.compute.manager [-] [instance: 51ec8779-f667-4f68-853c-545679d761b9] VM Stopped (Lifecycle Event)
Jan 26 10:14:32 compute-1 nova_compute[226322]: 2026-01-26 10:14:32.341 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:32 compute-1 nova_compute[226322]: 2026-01-26 10:14:32.349 226326 DEBUG nova.compute.manager [None req-daa35117-bc0e-4564-889e-3cdff8461da1 - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:14:32 compute-1 ceph-mon[80107]: pgmap v969: 353 pgs: 353 active+clean; 41 MiB data, 261 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:14:32 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1730871621' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:14:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:14:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:14:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:14:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:14:33 compute-1 nova_compute[226322]: 2026-01-26 10:14:33.378 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:14:33 compute-1 nova_compute[226322]: 2026-01-26 10:14:33.379 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:14:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:33.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:33.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:33 compute-1 nova_compute[226322]: 2026-01-26 10:14:33.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:14:33 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3807772300' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:14:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:14:34 compute-1 nova_compute[226322]: 2026-01-26 10:14:34.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:14:34 compute-1 nova_compute[226322]: 2026-01-26 10:14:34.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:14:34 compute-1 nova_compute[226322]: 2026-01-26 10:14:34.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:14:34 compute-1 nova_compute[226322]: 2026-01-26 10:14:34.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:14:34 compute-1 nova_compute[226322]: 2026-01-26 10:14:34.715 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:14:34 compute-1 nova_compute[226322]: 2026-01-26 10:14:34.826 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:34 compute-1 ceph-mon[80107]: pgmap v970: 353 pgs: 353 active+clean; 41 MiB data, 261 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:14:34 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2674525483' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:14:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:35.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:35.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:35 compute-1 nova_compute[226322]: 2026-01-26 10:14:35.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:14:35 compute-1 nova_compute[226322]: 2026-01-26 10:14:35.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:14:36 compute-1 nova_compute[226322]: 2026-01-26 10:14:36.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:14:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:14:37 compute-1 nova_compute[226322]: 2026-01-26 10:14:37.342 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:37.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:14:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:37.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:14:37 compute-1 nova_compute[226322]: 2026-01-26 10:14:37.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:14:37 compute-1 nova_compute[226322]: 2026-01-26 10:14:37.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:14:37 compute-1 nova_compute[226322]: 2026-01-26 10:14:37.710 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:14:37 compute-1 nova_compute[226322]: 2026-01-26 10:14:37.710 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:14:37 compute-1 nova_compute[226322]: 2026-01-26 10:14:37.710 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:14:37 compute-1 nova_compute[226322]: 2026-01-26 10:14:37.710 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:14:37 compute-1 nova_compute[226322]: 2026-01-26 10:14:37.710 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:14:37 compute-1 ceph-mon[80107]: pgmap v971: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Jan 26 10:14:37 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1976849613' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:14:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:14:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:14:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:14:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:14:38 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:14:38 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3798574157' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:14:38 compute-1 nova_compute[226322]: 2026-01-26 10:14:38.177 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:14:38 compute-1 nova_compute[226322]: 2026-01-26 10:14:38.340 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:14:38 compute-1 nova_compute[226322]: 2026-01-26 10:14:38.342 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4864MB free_disk=59.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:14:38 compute-1 nova_compute[226322]: 2026-01-26 10:14:38.342 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:14:38 compute-1 nova_compute[226322]: 2026-01-26 10:14:38.343 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:14:38 compute-1 nova_compute[226322]: 2026-01-26 10:14:38.401 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:14:38 compute-1 nova_compute[226322]: 2026-01-26 10:14:38.402 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:14:38 compute-1 nova_compute[226322]: 2026-01-26 10:14:38.414 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:14:38 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2833343807' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:14:38 compute-1 ceph-mon[80107]: pgmap v972: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Jan 26 10:14:38 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2098853532' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:14:38 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3798574157' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:14:38 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/4096772364' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:14:38 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:14:38 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/767898149' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:14:38 compute-1 nova_compute[226322]: 2026-01-26 10:14:38.859 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:14:38 compute-1 nova_compute[226322]: 2026-01-26 10:14:38.869 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:14:38 compute-1 nova_compute[226322]: 2026-01-26 10:14:38.889 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:14:38 compute-1 nova_compute[226322]: 2026-01-26 10:14:38.916 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:14:38 compute-1 nova_compute[226322]: 2026-01-26 10:14:38.917 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:14:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:39.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:14:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:39.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:14:39 compute-1 nova_compute[226322]: 2026-01-26 10:14:39.827 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/767898149' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:14:40 compute-1 podman[236593]: 2026-01-26 10:14:40.340450021 +0000 UTC m=+0.120972767 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:14:41 compute-1 ceph-mon[80107]: pgmap v973: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 1.8 MiB/s wr, 31 op/s
Jan 26 10:14:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:41.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:41.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:41 compute-1 sshd-session[236619]: Invalid user centos from 104.248.198.71 port 50770
Jan 26 10:14:41 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:14:41 compute-1 sshd-session[236619]: Connection closed by invalid user centos 104.248.198.71 port 50770 [preauth]
Jan 26 10:14:42 compute-1 ceph-mon[80107]: pgmap v974: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.8 MiB/s wr, 31 op/s
Jan 26 10:14:42 compute-1 nova_compute[226322]: 2026-01-26 10:14:42.343 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:14:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:14:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:14:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:14:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:43.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:43.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:44 compute-1 sudo[236623]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:14:44 compute-1 sudo[236623]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:14:44 compute-1 sudo[236623]: pam_unix(sudo:session): session closed for user root
Jan 26 10:14:44 compute-1 ceph-mon[80107]: pgmap v975: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.8 MiB/s wr, 31 op/s
Jan 26 10:14:44 compute-1 nova_compute[226322]: 2026-01-26 10:14:44.874 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:14:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:45.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:14:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:14:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:45.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:14:46 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:14:47 compute-1 nova_compute[226322]: 2026-01-26 10:14:47.347 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:47 compute-1 ceph-mon[80107]: pgmap v976: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 1.8 MiB/s wr, 128 op/s
Jan 26 10:14:47 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/4239229588' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:14:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:47.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:47.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:14:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:14:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:14:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:14:48 compute-1 ceph-mon[80107]: pgmap v977: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 101 op/s
Jan 26 10:14:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:14:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:49.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:49.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:49 compute-1 nova_compute[226322]: 2026-01-26 10:14:49.876 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:50 compute-1 ceph-mon[80107]: pgmap v978: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 101 op/s
Jan 26 10:14:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:51.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:14:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:51.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:14:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:14:52 compute-1 nova_compute[226322]: 2026-01-26 10:14:52.373 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:52 compute-1 ceph-mon[80107]: pgmap v979: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 98 op/s
Jan 26 10:14:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:14:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:14:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:14:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:14:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:53.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:53 compute-1 nova_compute[226322]: 2026-01-26 10:14:53.647 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:53.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:53 compute-1 nova_compute[226322]: 2026-01-26 10:14:53.718 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:53.937 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:14:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:53.938 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:14:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:14:53.938 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:14:54 compute-1 ceph-mon[80107]: pgmap v980: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 98 op/s
Jan 26 10:14:54 compute-1 nova_compute[226322]: 2026-01-26 10:14:54.878 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:55 compute-1 podman[236654]: 2026-01-26 10:14:55.269537715 +0000 UTC m=+0.046461203 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 26 10:14:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:14:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:55.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:14:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:55.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:14:56 compute-1 ceph-mon[80107]: pgmap v981: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 98 op/s
Jan 26 10:14:57 compute-1 nova_compute[226322]: 2026-01-26 10:14:57.377 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:14:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:57.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:14:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:57.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:14:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:14:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:14:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:14:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:14:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 10:14:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/96791862' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:14:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 10:14:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/96791862' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:14:58 compute-1 ceph-mon[80107]: pgmap v982: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:14:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/96791862' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:14:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/96791862' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:14:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:59.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:14:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:14:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:59.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:14:59 compute-1 nova_compute[226322]: 2026-01-26 10:14:59.880 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:00 compute-1 ceph-mon[80107]: pgmap v983: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:15:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:15:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:01.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:15:01 compute-1 sshd-session[236678]: Invalid user admin from 178.62.249.31 port 52458
Jan 26 10:15:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:15:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:01.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:15:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:15:01 compute-1 sshd-session[236678]: Connection closed by invalid user admin 178.62.249.31 port 52458 [preauth]
Jan 26 10:15:02 compute-1 nova_compute[226322]: 2026-01-26 10:15:02.381 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:15:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:15:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:15:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:15:03 compute-1 ceph-mon[80107]: pgmap v984: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:15:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:03.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:15:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:03.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:15:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:15:04 compute-1 sudo[236682]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:15:04 compute-1 sudo[236682]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:15:04 compute-1 sudo[236682]: pam_unix(sudo:session): session closed for user root
Jan 26 10:15:04 compute-1 nova_compute[226322]: 2026-01-26 10:15:04.882 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:05 compute-1 ceph-mon[80107]: pgmap v985: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:15:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:15:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:05.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:15:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:15:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:05.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:15:06 compute-1 ceph-mon[80107]: pgmap v986: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:15:06 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:15:07 compute-1 nova_compute[226322]: 2026-01-26 10:15:07.385 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:07.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:15:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:07.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:15:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:15:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:15:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:15:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:15:08 compute-1 ceph-mon[80107]: pgmap v987: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:15:09 compute-1 sshd-session[236710]: Invalid user admin from 152.42.136.110 port 57066
Jan 26 10:15:09 compute-1 sshd-session[236710]: Connection closed by invalid user admin 152.42.136.110 port 57066 [preauth]
Jan 26 10:15:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:09.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:09.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:09 compute-1 nova_compute[226322]: 2026-01-26 10:15:09.884 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:10 compute-1 ceph-mon[80107]: pgmap v988: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:15:11 compute-1 podman[236713]: 2026-01-26 10:15:11.305334104 +0000 UTC m=+0.086686652 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 10:15:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:11.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:15:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:11.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:15:11 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:15:12 compute-1 nova_compute[226322]: 2026-01-26 10:15:12.387 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:12 compute-1 ceph-mon[80107]: pgmap v989: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:15:12 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1075107949' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:15:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:15:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:15:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:15:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:15:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 10:15:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:13.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 10:15:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:13.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:14 compute-1 nova_compute[226322]: 2026-01-26 10:15:14.886 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:14 compute-1 ceph-mon[80107]: pgmap v990: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:15:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:15.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:15:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:15.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:15:16 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:15:16 compute-1 ceph-mon[80107]: pgmap v991: 353 pgs: 353 active+clean; 88 MiB data, 276 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Jan 26 10:15:16 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/559084737' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:15:17 compute-1 nova_compute[226322]: 2026-01-26 10:15:17.390 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:17.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:17.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:15:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:15:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:15:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:15:18 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3526497005' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:15:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:19.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:19.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:19 compute-1 ceph-mon[80107]: pgmap v992: 353 pgs: 353 active+clean; 88 MiB data, 276 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Jan 26 10:15:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:15:19 compute-1 nova_compute[226322]: 2026-01-26 10:15:19.887 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:20 compute-1 ceph-mon[80107]: pgmap v993: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 1.8 MiB/s wr, 38 op/s
Jan 26 10:15:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:21.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:21.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:21 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:15:22 compute-1 nova_compute[226322]: 2026-01-26 10:15:22.393 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:22 compute-1 ceph-mon[80107]: pgmap v994: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Jan 26 10:15:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:15:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:15:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:15:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:15:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:15:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:23.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:15:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:15:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:23.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:15:24 compute-1 nova_compute[226322]: 2026-01-26 10:15:24.889 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:24 compute-1 sudo[236748]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:15:24 compute-1 sudo[236748]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:15:24 compute-1 sudo[236748]: pam_unix(sudo:session): session closed for user root
Jan 26 10:15:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:25.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:25 compute-1 ceph-mon[80107]: pgmap v995: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Jan 26 10:15:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:15:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:25.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:15:25 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:15:25.968 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:15:25 compute-1 nova_compute[226322]: 2026-01-26 10:15:25.968 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:25 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:15:25.969 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 10:15:25 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:15:25.971 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:15:26 compute-1 podman[236774]: 2026-01-26 10:15:26.273397376 +0000 UTC m=+0.051422957 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 10:15:26 compute-1 ceph-mon[80107]: pgmap v996: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 102 op/s
Jan 26 10:15:26 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:15:27 compute-1 nova_compute[226322]: 2026-01-26 10:15:27.396 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:27.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:27.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:27 compute-1 sudo[236794]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:15:27 compute-1 sudo[236794]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:15:27 compute-1 sudo[236794]: pam_unix(sudo:session): session closed for user root
Jan 26 10:15:27 compute-1 sudo[236819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 26 10:15:27 compute-1 sudo[236819]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:15:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:15:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:15:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:15:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:15:28 compute-1 ovn_controller[133670]: 2026-01-26T10:15:28Z|00066|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 26 10:15:29 compute-1 ceph-mon[80107]: pgmap v997: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 26 10:15:29 compute-1 podman[236917]: 2026-01-26 10:15:29.55440675 +0000 UTC m=+1.163738272 container exec 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 10:15:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:29.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:29.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:29 compute-1 sshd-session[236931]: Invalid user centos from 104.248.198.71 port 56806
Jan 26 10:15:29 compute-1 podman[236917]: 2026-01-26 10:15:29.816085996 +0000 UTC m=+1.425417508 container exec_died 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Jan 26 10:15:29 compute-1 nova_compute[226322]: 2026-01-26 10:15:29.891 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:29 compute-1 sshd-session[236931]: Connection closed by invalid user centos 104.248.198.71 port 56806 [preauth]
Jan 26 10:15:31 compute-1 podman[237059]: 2026-01-26 10:15:31.00409731 +0000 UTC m=+0.176667892 container exec 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 10:15:31 compute-1 ceph-mon[80107]: pgmap v998: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 75 op/s
Jan 26 10:15:31 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:15:31 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:15:31 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 26 10:15:31 compute-1 podman[237084]: 2026-01-26 10:15:31.087909072 +0000 UTC m=+0.062935853 container exec_died 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 10:15:31 compute-1 podman[237059]: 2026-01-26 10:15:31.299179079 +0000 UTC m=+0.471749681 container exec_died 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 10:15:31 compute-1 podman[237133]: 2026-01-26 10:15:31.602691137 +0000 UTC m=+0.105487354 container exec 2481bf60245d7c2e187aaa8c0a64c69d854b50d0db3801811d86e7df60b7c5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 10:15:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:15:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:31.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:15:31 compute-1 podman[237153]: 2026-01-26 10:15:31.674892552 +0000 UTC m=+0.052953799 container exec_died 2481bf60245d7c2e187aaa8c0a64c69d854b50d0db3801811d86e7df60b7c5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 10:15:31 compute-1 podman[237133]: 2026-01-26 10:15:31.713661922 +0000 UTC m=+0.216458099 container exec_died 2481bf60245d7c2e187aaa8c0a64c69d854b50d0db3801811d86e7df60b7c5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 10:15:31 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:15:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:31.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:32 compute-1 podman[237197]: 2026-01-26 10:15:32.083163695 +0000 UTC m=+0.232575370 container exec 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 10:15:32 compute-1 ceph-mon[80107]: pgmap v999: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 64 op/s
Jan 26 10:15:32 compute-1 podman[237197]: 2026-01-26 10:15:32.331073853 +0000 UTC m=+0.480485418 container exec_died 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 10:15:32 compute-1 nova_compute[226322]: 2026-01-26 10:15:32.398 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:32 compute-1 podman[237263]: 2026-01-26 10:15:32.749825824 +0000 UTC m=+0.229444705 container exec 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, release=1793, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, version=2.2.4, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, name=keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 26 10:15:32 compute-1 podman[237284]: 2026-01-26 10:15:32.821850773 +0000 UTC m=+0.052793804 container exec_died 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=keepalived for Ceph, release=1793, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 10:15:32 compute-1 podman[237263]: 2026-01-26 10:15:32.915386 +0000 UTC m=+0.395004861 container exec_died 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, vendor=Red Hat, Inc., version=2.2.4, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, vcs-type=git, architecture=x86_64, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, release=1793, description=keepalived for Ceph)
Jan 26 10:15:32 compute-1 sudo[236819]: pam_unix(sudo:session): session closed for user root
Jan 26 10:15:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:15:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:15:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:15:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:15:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:33.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:33 compute-1 sudo[237298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:15:33 compute-1 sudo[237298]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:15:33 compute-1 sudo[237298]: pam_unix(sudo:session): session closed for user root
Jan 26 10:15:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:33.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:33 compute-1 sudo[237323]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:15:33 compute-1 sudo[237323]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:15:33 compute-1 nova_compute[226322]: 2026-01-26 10:15:33.919 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:15:34 compute-1 sudo[237323]: pam_unix(sudo:session): session closed for user root
Jan 26 10:15:34 compute-1 sudo[237380]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:15:34 compute-1 sudo[237380]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:15:34 compute-1 sudo[237380]: pam_unix(sudo:session): session closed for user root
Jan 26 10:15:34 compute-1 sudo[237405]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Jan 26 10:15:34 compute-1 sudo[237405]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:15:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:15:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:15:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:15:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:15:34 compute-1 ceph-mon[80107]: pgmap v1000: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 64 op/s
Jan 26 10:15:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:15:34 compute-1 nova_compute[226322]: 2026-01-26 10:15:34.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:15:34 compute-1 sudo[237405]: pam_unix(sudo:session): session closed for user root
Jan 26 10:15:34 compute-1 nova_compute[226322]: 2026-01-26 10:15:34.893 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:35.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:35 compute-1 nova_compute[226322]: 2026-01-26 10:15:35.682 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:15:35 compute-1 nova_compute[226322]: 2026-01-26 10:15:35.682 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:15:35 compute-1 nova_compute[226322]: 2026-01-26 10:15:35.705 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:15:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:15:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:35.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:15:35 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2751436258' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:15:35 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/265833480' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:15:35 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:15:36 compute-1 nova_compute[226322]: 2026-01-26 10:15:36.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:15:36 compute-1 nova_compute[226322]: 2026-01-26 10:15:36.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:15:36 compute-1 nova_compute[226322]: 2026-01-26 10:15:36.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:15:36 compute-1 nova_compute[226322]: 2026-01-26 10:15:36.705 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:15:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:15:37 compute-1 nova_compute[226322]: 2026-01-26 10:15:37.399 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:15:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:37.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:15:37 compute-1 nova_compute[226322]: 2026-01-26 10:15:37.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:15:37 compute-1 nova_compute[226322]: 2026-01-26 10:15:37.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:15:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:37.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:37 compute-1 ceph-mon[80107]: pgmap v1001: 353 pgs: 353 active+clean; 113 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 117 op/s
Jan 26 10:15:37 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:15:37 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:15:37 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 26 10:15:37 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:15:37 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 26 10:15:37 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:15:37 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:15:37 compute-1 ceph-mon[80107]: pgmap v1002: 353 pgs: 353 active+clean; 113 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 346 KiB/s rd, 2.5 MiB/s wr, 62 op/s
Jan 26 10:15:37 compute-1 ceph-mon[80107]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Jan 26 10:15:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:15:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:15:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:15:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:15:38 compute-1 nova_compute[226322]: 2026-01-26 10:15:38.688 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:15:38 compute-1 nova_compute[226322]: 2026-01-26 10:15:38.689 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:15:39 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:15:39 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:15:39 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:15:39 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:15:39 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:15:39 compute-1 ceph-mon[80107]: pgmap v1003: 353 pgs: 353 active+clean; 113 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 346 KiB/s rd, 2.5 MiB/s wr, 62 op/s
Jan 26 10:15:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:39.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:39 compute-1 nova_compute[226322]: 2026-01-26 10:15:39.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:15:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:15:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:39.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:15:39 compute-1 nova_compute[226322]: 2026-01-26 10:15:39.855 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:15:39 compute-1 nova_compute[226322]: 2026-01-26 10:15:39.856 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:15:39 compute-1 nova_compute[226322]: 2026-01-26 10:15:39.856 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:15:39 compute-1 nova_compute[226322]: 2026-01-26 10:15:39.856 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:15:39 compute-1 nova_compute[226322]: 2026-01-26 10:15:39.856 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:15:39 compute-1 nova_compute[226322]: 2026-01-26 10:15:39.895 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:40 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:15:40 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/552703975' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:15:40 compute-1 nova_compute[226322]: 2026-01-26 10:15:40.532 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.675s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:15:40 compute-1 ceph-mon[80107]: pgmap v1004: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 385 KiB/s rd, 2.5 MiB/s wr, 75 op/s
Jan 26 10:15:40 compute-1 nova_compute[226322]: 2026-01-26 10:15:40.751 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:15:40 compute-1 nova_compute[226322]: 2026-01-26 10:15:40.753 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4837MB free_disk=59.94289016723633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:15:40 compute-1 nova_compute[226322]: 2026-01-26 10:15:40.753 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:15:40 compute-1 nova_compute[226322]: 2026-01-26 10:15:40.754 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:15:40 compute-1 nova_compute[226322]: 2026-01-26 10:15:40.921 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:15:40 compute-1 nova_compute[226322]: 2026-01-26 10:15:40.922 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:15:40 compute-1 nova_compute[226322]: 2026-01-26 10:15:40.958 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:15:41 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:15:41 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3497306287' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:15:41 compute-1 nova_compute[226322]: 2026-01-26 10:15:41.427 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:15:41 compute-1 nova_compute[226322]: 2026-01-26 10:15:41.436 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:15:41 compute-1 nova_compute[226322]: 2026-01-26 10:15:41.467 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:15:41 compute-1 nova_compute[226322]: 2026-01-26 10:15:41.470 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:15:41 compute-1 nova_compute[226322]: 2026-01-26 10:15:41.470 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.717s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:15:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:41.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:41 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:15:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:15:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:41.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:15:42 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/552703975' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:15:42 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3497306287' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:15:42 compute-1 podman[237496]: 2026-01-26 10:15:42.351636897 +0000 UTC m=+0.118254794 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 26 10:15:42 compute-1 nova_compute[226322]: 2026-01-26 10:15:42.401 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:15:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:15:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:15:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:15:43 compute-1 ceph-mon[80107]: pgmap v1005: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 385 KiB/s rd, 2.5 MiB/s wr, 75 op/s
Jan 26 10:15:43 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1678315008' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:15:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:15:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:43.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:15:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:15:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:43.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:15:44 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1121211806' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:15:44 compute-1 ceph-mon[80107]: pgmap v1006: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 385 KiB/s rd, 2.5 MiB/s wr, 75 op/s
Jan 26 10:15:44 compute-1 nova_compute[226322]: 2026-01-26 10:15:44.896 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:44 compute-1 sudo[237524]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:15:44 compute-1 sudo[237524]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:15:44 compute-1 sudo[237524]: pam_unix(sudo:session): session closed for user root
Jan 26 10:15:44 compute-1 sudo[237547]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:15:44 compute-1 sudo[237547]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:15:44 compute-1 sudo[237547]: pam_unix(sudo:session): session closed for user root
Jan 26 10:15:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:15:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:45.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:15:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:15:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:45.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:15:45 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:15:45 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:15:46 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:15:46 compute-1 ceph-mon[80107]: pgmap v1007: 353 pgs: 353 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 66 KiB/s rd, 38 KiB/s wr, 47 op/s
Jan 26 10:15:46 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3004217025' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:15:47 compute-1 nova_compute[226322]: 2026-01-26 10:15:47.449 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:47 compute-1 sshd-session[237575]: Invalid user admin from 178.62.249.31 port 40978
Jan 26 10:15:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:15:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:47.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:15:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:47.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:47 compute-1 sshd-session[237575]: Connection closed by invalid user admin 178.62.249.31 port 40978 [preauth]
Jan 26 10:15:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:15:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:15:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:15:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:15:48 compute-1 ceph-mon[80107]: pgmap v1008: 353 pgs: 353 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 56 KiB/s rd, 32 KiB/s wr, 40 op/s
Jan 26 10:15:49 compute-1 sshd-session[237578]: Invalid user admin from 152.42.136.110 port 58580
Jan 26 10:15:49 compute-1 sshd-session[237578]: Connection closed by invalid user admin 152.42.136.110 port 58580 [preauth]
Jan 26 10:15:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:49.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:15:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:49.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:15:49 compute-1 nova_compute[226322]: 2026-01-26 10:15:49.899 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:50 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:15:50 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:15:51 compute-1 ceph-mon[80107]: pgmap v1009: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 57 KiB/s rd, 32 KiB/s wr, 41 op/s
Jan 26 10:15:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:15:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:51.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:15:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:15:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:51.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:52 compute-1 ceph-mon[80107]: pgmap v1010: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 14 KiB/s wr, 29 op/s
Jan 26 10:15:52 compute-1 nova_compute[226322]: 2026-01-26 10:15:52.455 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:15:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:15:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:15:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:15:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:15:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:53.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:15:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:15:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:53.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:15:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:15:53.939 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:15:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:15:53.944 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:15:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:15:53.945 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:15:54 compute-1 nova_compute[226322]: 2026-01-26 10:15:54.949 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:55 compute-1 ceph-mon[80107]: pgmap v1011: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 14 KiB/s wr, 29 op/s
Jan 26 10:15:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:15:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:55.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:15:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:15:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:55.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:15:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:15:57 compute-1 ceph-mon[80107]: pgmap v1012: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 14 KiB/s wr, 30 op/s
Jan 26 10:15:57 compute-1 podman[237584]: 2026-01-26 10:15:57.302194206 +0000 UTC m=+0.074428295 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:15:57 compute-1 nova_compute[226322]: 2026-01-26 10:15:57.492 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:15:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:57.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:57.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:15:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:15:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:15:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:15:58 compute-1 ceph-mon[80107]: pgmap v1013: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 0 B/s wr, 1 op/s
Jan 26 10:15:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/1968492371' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:15:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/1968492371' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:15:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:15:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:59.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:15:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:15:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:15:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:59.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:15:59 compute-1 nova_compute[226322]: 2026-01-26 10:15:59.988 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:00 compute-1 ceph-mon[80107]: pgmap v1014: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 0 B/s wr, 1 op/s
Jan 26 10:16:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:01.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:16:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:01.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:02 compute-1 nova_compute[226322]: 2026-01-26 10:16:02.493 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:16:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:16:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:16:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:16:03 compute-1 ceph-mon[80107]: pgmap v1015: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:16:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:03.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:03.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:16:04 compute-1 nova_compute[226322]: 2026-01-26 10:16:04.990 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:05 compute-1 ceph-mon[80107]: pgmap v1016: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:16:05 compute-1 sudo[237607]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:16:05 compute-1 sudo[237607]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:16:05 compute-1 sudo[237607]: pam_unix(sudo:session): session closed for user root
Jan 26 10:16:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:05.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:05.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:06 compute-1 ceph-mon[80107]: pgmap v1017: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:16:06 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:16:07 compute-1 nova_compute[226322]: 2026-01-26 10:16:07.545 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:07 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/603851357' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:16:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000054s ======
Jan 26 10:16:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:07.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 26 10:16:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:16:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:07.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:16:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:16:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:16:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:16:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:16:08 compute-1 ceph-mon[80107]: pgmap v1018: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:16:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:09.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:09.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:09 compute-1 nova_compute[226322]: 2026-01-26 10:16:09.991 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:11 compute-1 ceph-mon[80107]: pgmap v1019: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Jan 26 10:16:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:11.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:11 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:16:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:11.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:12 compute-1 nova_compute[226322]: 2026-01-26 10:16:12.550 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:12 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 26 10:16:12 compute-1 podman[237637]: 2026-01-26 10:16:12.914582369 +0000 UTC m=+0.110295707 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 26 10:16:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:16:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:16:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:16:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:16:13 compute-1 ceph-mon[80107]: pgmap v1020: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Jan 26 10:16:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:13.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:13.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:14 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/448725082' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:16:14 compute-1 ceph-mon[80107]: pgmap v1021: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Jan 26 10:16:14 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3197192949' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:16:14 compute-1 nova_compute[226322]: 2026-01-26 10:16:14.994 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:15.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:15.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:16 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:16:17 compute-1 ceph-mon[80107]: pgmap v1022: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Jan 26 10:16:17 compute-1 nova_compute[226322]: 2026-01-26 10:16:17.552 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:17.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:17.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:16:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:16:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:16:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:16:18 compute-1 sshd-session[237665]: Invalid user centos from 104.248.198.71 port 56698
Jan 26 10:16:18 compute-1 sshd-session[237665]: Connection closed by invalid user centos 104.248.198.71 port 56698 [preauth]
Jan 26 10:16:19 compute-1 ceph-mon[80107]: pgmap v1023: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Jan 26 10:16:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:16:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:19.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:16:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:19.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:16:19 compute-1 nova_compute[226322]: 2026-01-26 10:16:19.996 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:20 compute-1 ceph-mon[80107]: pgmap v1024: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 102 op/s
Jan 26 10:16:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:21.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:21 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:16:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:21.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:22 compute-1 nova_compute[226322]: 2026-01-26 10:16:22.553 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:16:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:16:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:16:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:16:23 compute-1 ceph-mon[80107]: pgmap v1025: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 26 10:16:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:23.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:16:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:23.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:16:24 compute-1 nova_compute[226322]: 2026-01-26 10:16:24.997 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:25 compute-1 ceph-mon[80107]: pgmap v1026: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 26 10:16:25 compute-1 sudo[237670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:16:25 compute-1 sudo[237670]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:16:25 compute-1 sudo[237670]: pam_unix(sudo:session): session closed for user root
Jan 26 10:16:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:25.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:25.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:26 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:16:27 compute-1 ceph-mon[80107]: pgmap v1027: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 75 op/s
Jan 26 10:16:27 compute-1 nova_compute[226322]: 2026-01-26 10:16:27.556 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:16:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:27.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:16:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:16:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:27.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:16:27 compute-1 nova_compute[226322]: 2026-01-26 10:16:27.866 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "b040151a-46d9-4685-84c4-316c2d7feedb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:16:27 compute-1 nova_compute[226322]: 2026-01-26 10:16:27.866 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:16:27 compute-1 nova_compute[226322]: 2026-01-26 10:16:27.881 226326 DEBUG nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 10:16:27 compute-1 nova_compute[226322]: 2026-01-26 10:16:27.948 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:16:27 compute-1 nova_compute[226322]: 2026-01-26 10:16:27.949 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:16:27 compute-1 nova_compute[226322]: 2026-01-26 10:16:27.957 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 10:16:27 compute-1 nova_compute[226322]: 2026-01-26 10:16:27.958 226326 INFO nova.compute.claims [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Claim successful on node compute-1.ctlplane.example.com
Jan 26 10:16:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:16:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:16:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:16:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.065 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:16:28 compute-1 podman[237717]: 2026-01-26 10:16:28.275367724 +0000 UTC m=+0.054182343 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 26 10:16:28 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:16:28 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2689113532' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.502 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.509 226326 DEBUG nova.compute.provider_tree [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.526 226326 DEBUG nova.scheduler.client.report [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.546 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.547 226326 DEBUG nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.603 226326 DEBUG nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.603 226326 DEBUG nova.network.neutron [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.638 226326 INFO nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.663 226326 DEBUG nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.771 226326 DEBUG nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.772 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.772 226326 INFO nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Creating image(s)
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.796 226326 DEBUG nova.storage.rbd_utils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image b040151a-46d9-4685-84c4-316c2d7feedb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.822 226326 DEBUG nova.storage.rbd_utils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image b040151a-46d9-4685-84c4-316c2d7feedb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.845 226326 DEBUG nova.storage.rbd_utils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image b040151a-46d9-4685-84c4-316c2d7feedb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.849 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.905 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.906 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "d81880e926e175d0cc7241caa7cc18231a8a289c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.906 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "d81880e926e175d0cc7241caa7cc18231a8a289c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.907 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "d81880e926e175d0cc7241caa7cc18231a8a289c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.933 226326 DEBUG nova.storage.rbd_utils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image b040151a-46d9-4685-84c4-316c2d7feedb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:16:28 compute-1 nova_compute[226322]: 2026-01-26 10:16:28.936 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c b040151a-46d9-4685-84c4-316c2d7feedb_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:16:29 compute-1 ceph-mon[80107]: pgmap v1028: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 26 10:16:29 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2689113532' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:16:29 compute-1 nova_compute[226322]: 2026-01-26 10:16:29.177 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c b040151a-46d9-4685-84c4-316c2d7feedb_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.241s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:16:29 compute-1 nova_compute[226322]: 2026-01-26 10:16:29.234 226326 DEBUG nova.storage.rbd_utils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] resizing rbd image b040151a-46d9-4685-84c4-316c2d7feedb_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 26 10:16:29 compute-1 nova_compute[226322]: 2026-01-26 10:16:29.264 226326 DEBUG nova.policy [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c1208d3e25b940ea93fe76884c7a53db', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 10:16:29 compute-1 nova_compute[226322]: 2026-01-26 10:16:29.313 226326 DEBUG nova.objects.instance [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'migration_context' on Instance uuid b040151a-46d9-4685-84c4-316c2d7feedb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 10:16:29 compute-1 nova_compute[226322]: 2026-01-26 10:16:29.329 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 10:16:29 compute-1 nova_compute[226322]: 2026-01-26 10:16:29.330 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Ensure instance console log exists: /var/lib/nova/instances/b040151a-46d9-4685-84c4-316c2d7feedb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 10:16:29 compute-1 nova_compute[226322]: 2026-01-26 10:16:29.331 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:16:29 compute-1 nova_compute[226322]: 2026-01-26 10:16:29.331 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:16:29 compute-1 nova_compute[226322]: 2026-01-26 10:16:29.332 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:16:29 compute-1 nova_compute[226322]: 2026-01-26 10:16:29.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:16:29 compute-1 nova_compute[226322]: 2026-01-26 10:16:29.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 26 10:16:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:29.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:16:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:29.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:16:29 compute-1 nova_compute[226322]: 2026-01-26 10:16:29.999 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:30 compute-1 ceph-mon[80107]: pgmap v1029: 353 pgs: 353 active+clean; 155 MiB data, 316 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 3.8 MiB/s wr, 146 op/s
Jan 26 10:16:30 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:30.481 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:16:30 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:30.482 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 10:16:30 compute-1 nova_compute[226322]: 2026-01-26 10:16:30.482 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:30 compute-1 nova_compute[226322]: 2026-01-26 10:16:30.589 226326 DEBUG nova.network.neutron [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Successfully created port: e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 10:16:31 compute-1 sshd-session[237906]: Invalid user admin from 152.42.136.110 port 42752
Jan 26 10:16:31 compute-1 sshd-session[237906]: Connection closed by invalid user admin 152.42.136.110 port 42752 [preauth]
Jan 26 10:16:31 compute-1 nova_compute[226322]: 2026-01-26 10:16:31.521 226326 DEBUG nova.network.neutron [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Successfully updated port: e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 10:16:31 compute-1 nova_compute[226322]: 2026-01-26 10:16:31.539 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:16:31 compute-1 nova_compute[226322]: 2026-01-26 10:16:31.539 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquired lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:16:31 compute-1 nova_compute[226322]: 2026-01-26 10:16:31.539 226326 DEBUG nova.network.neutron [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 10:16:31 compute-1 sshd-session[237908]: Invalid user admin from 178.62.249.31 port 38076
Jan 26 10:16:31 compute-1 nova_compute[226322]: 2026-01-26 10:16:31.654 226326 DEBUG nova.compute.manager [req-21652baa-e1ad-4586-8489-7f8a7af32e45 req-439277ee-8eb6-4654-8f70-9ea8cf0a7a6c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received event network-changed-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:16:31 compute-1 nova_compute[226322]: 2026-01-26 10:16:31.655 226326 DEBUG nova.compute.manager [req-21652baa-e1ad-4586-8489-7f8a7af32e45 req-439277ee-8eb6-4654-8f70-9ea8cf0a7a6c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Refreshing instance network info cache due to event network-changed-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 10:16:31 compute-1 nova_compute[226322]: 2026-01-26 10:16:31.655 226326 DEBUG oslo_concurrency.lockutils [req-21652baa-e1ad-4586-8489-7f8a7af32e45 req-439277ee-8eb6-4654-8f70-9ea8cf0a7a6c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:16:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:31.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:31 compute-1 nova_compute[226322]: 2026-01-26 10:16:31.746 226326 DEBUG nova.network.neutron [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 10:16:31 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:16:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:16:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:31.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:16:31 compute-1 sshd-session[237908]: Connection closed by invalid user admin 178.62.249.31 port 38076 [preauth]
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.558 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.756 226326 DEBUG nova.network.neutron [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updating instance_info_cache with network_info: [{"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.845 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Releasing lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.846 226326 DEBUG nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Instance network_info: |[{"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.846 226326 DEBUG oslo_concurrency.lockutils [req-21652baa-e1ad-4586-8489-7f8a7af32e45 req-439277ee-8eb6-4654-8f70-9ea8cf0a7a6c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.846 226326 DEBUG nova.network.neutron [req-21652baa-e1ad-4586-8489-7f8a7af32e45 req-439277ee-8eb6-4654-8f70-9ea8cf0a7a6c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Refreshing network info cache for port e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.849 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Start _get_guest_xml network_info=[{"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T10:05:39Z,direct_url=<?>,disk_format='qcow2',id=6789692f-fc1f-4efa-ae75-dcc13be695ef,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3ff3fa2a5531460b993c609589aa545d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T10:05:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'size': 0, 'image_id': '6789692f-fc1f-4efa-ae75-dcc13be695ef'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.854 226326 WARNING nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.859 226326 DEBUG nova.virt.libvirt.host [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.860 226326 DEBUG nova.virt.libvirt.host [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.863 226326 DEBUG nova.virt.libvirt.host [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.864 226326 DEBUG nova.virt.libvirt.host [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.864 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.865 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T10:05:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='57e1601b-dbfa-4d3b-8b96-27302e4a7a06',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T10:05:39Z,direct_url=<?>,disk_format='qcow2',id=6789692f-fc1f-4efa-ae75-dcc13be695ef,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3ff3fa2a5531460b993c609589aa545d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T10:05:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.865 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.866 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.866 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.866 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.866 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.867 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.867 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.867 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.867 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.868 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.871 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:16:32 compute-1 nova_compute[226322]: 2026-01-26 10:16:32.893 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:16:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:16:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:16:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:16:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:16:33 compute-1 ceph-mon[80107]: pgmap v1030: 353 pgs: 353 active+clean; 155 MiB data, 316 MiB used, 60 GiB / 60 GiB avail; 251 KiB/s rd, 3.8 MiB/s wr, 71 op/s
Jan 26 10:16:33 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 26 10:16:33 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1830823020' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.370 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.397 226326 DEBUG nova.storage.rbd_utils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image b040151a-46d9-4685-84c4-316c2d7feedb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.401 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:16:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:33.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:33 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 26 10:16:33 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3710592775' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:16:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:33.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.836 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.837 226326 DEBUG nova.virt.libvirt.vif [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T10:16:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1174728540',display_name='tempest-TestNetworkBasicOps-server-1174728540',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1174728540',id=12,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB+EGw3c7uXEi4OTNwulYSgVH+l4n13lnOEqRefy0kW9/M5qFO/QglqfVEv83QxC3p6/WVWNJKRmKF5HdJHv8FbQly+oOPz0zT0wKKx+uweCnuuDtsTED/V9sqhB977UJw==',key_name='tempest-TestNetworkBasicOps-1514805345',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-hl8103a0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T10:16:28Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=b040151a-46d9-4685-84c4-316c2d7feedb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.838 226326 DEBUG nova.network.os_vif_util [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.839 226326 DEBUG nova.network.os_vif_util [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:1d:9d,bridge_name='br-int',has_traffic_filtering=True,id=e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae,network=Network(9bff64e0-694f-4b2d-b4b5-5e3b1d94460e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape8f0e0cf-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.840 226326 DEBUG nova.objects.instance [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'pci_devices' on Instance uuid b040151a-46d9-4685-84c4-316c2d7feedb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.859 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] End _get_guest_xml xml=<domain type="kvm">
Jan 26 10:16:33 compute-1 nova_compute[226322]:   <uuid>b040151a-46d9-4685-84c4-316c2d7feedb</uuid>
Jan 26 10:16:33 compute-1 nova_compute[226322]:   <name>instance-0000000c</name>
Jan 26 10:16:33 compute-1 nova_compute[226322]:   <memory>131072</memory>
Jan 26 10:16:33 compute-1 nova_compute[226322]:   <vcpu>1</vcpu>
Jan 26 10:16:33 compute-1 nova_compute[226322]:   <metadata>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <nova:name>tempest-TestNetworkBasicOps-server-1174728540</nova:name>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <nova:creationTime>2026-01-26 10:16:32</nova:creationTime>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <nova:flavor name="m1.nano">
Jan 26 10:16:33 compute-1 nova_compute[226322]:         <nova:memory>128</nova:memory>
Jan 26 10:16:33 compute-1 nova_compute[226322]:         <nova:disk>1</nova:disk>
Jan 26 10:16:33 compute-1 nova_compute[226322]:         <nova:swap>0</nova:swap>
Jan 26 10:16:33 compute-1 nova_compute[226322]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 10:16:33 compute-1 nova_compute[226322]:         <nova:vcpus>1</nova:vcpus>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       </nova:flavor>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <nova:owner>
Jan 26 10:16:33 compute-1 nova_compute[226322]:         <nova:user uuid="c1208d3e25b940ea93fe76884c7a53db">tempest-TestNetworkBasicOps-966559857-project-member</nova:user>
Jan 26 10:16:33 compute-1 nova_compute[226322]:         <nova:project uuid="6ed221b375a44fc2bb2a8f232c5446e7">tempest-TestNetworkBasicOps-966559857</nova:project>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       </nova:owner>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <nova:root type="image" uuid="6789692f-fc1f-4efa-ae75-dcc13be695ef"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <nova:ports>
Jan 26 10:16:33 compute-1 nova_compute[226322]:         <nova:port uuid="e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae">
Jan 26 10:16:33 compute-1 nova_compute[226322]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:         </nova:port>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       </nova:ports>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     </nova:instance>
Jan 26 10:16:33 compute-1 nova_compute[226322]:   </metadata>
Jan 26 10:16:33 compute-1 nova_compute[226322]:   <sysinfo type="smbios">
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <system>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <entry name="manufacturer">RDO</entry>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <entry name="product">OpenStack Compute</entry>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <entry name="serial">b040151a-46d9-4685-84c4-316c2d7feedb</entry>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <entry name="uuid">b040151a-46d9-4685-84c4-316c2d7feedb</entry>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <entry name="family">Virtual Machine</entry>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     </system>
Jan 26 10:16:33 compute-1 nova_compute[226322]:   </sysinfo>
Jan 26 10:16:33 compute-1 nova_compute[226322]:   <os>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <boot dev="hd"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <smbios mode="sysinfo"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:   </os>
Jan 26 10:16:33 compute-1 nova_compute[226322]:   <features>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <acpi/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <apic/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <vmcoreinfo/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:   </features>
Jan 26 10:16:33 compute-1 nova_compute[226322]:   <clock offset="utc">
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <timer name="hpet" present="no"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:   </clock>
Jan 26 10:16:33 compute-1 nova_compute[226322]:   <cpu mode="host-model" match="exact">
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:   </cpu>
Jan 26 10:16:33 compute-1 nova_compute[226322]:   <devices>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <disk type="network" device="disk">
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <driver type="raw" cache="none"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <source protocol="rbd" name="vms/b040151a-46d9-4685-84c4-316c2d7feedb_disk">
Jan 26 10:16:33 compute-1 nova_compute[226322]:         <host name="192.168.122.100" port="6789"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:         <host name="192.168.122.102" port="6789"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:         <host name="192.168.122.101" port="6789"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       </source>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <auth username="openstack">
Jan 26 10:16:33 compute-1 nova_compute[226322]:         <secret type="ceph" uuid="1a70b85d-e3fd-5814-8a6a-37ea00fcae30"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       </auth>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <target dev="vda" bus="virtio"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     </disk>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <disk type="network" device="cdrom">
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <driver type="raw" cache="none"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <source protocol="rbd" name="vms/b040151a-46d9-4685-84c4-316c2d7feedb_disk.config">
Jan 26 10:16:33 compute-1 nova_compute[226322]:         <host name="192.168.122.100" port="6789"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:         <host name="192.168.122.102" port="6789"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:         <host name="192.168.122.101" port="6789"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       </source>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <auth username="openstack">
Jan 26 10:16:33 compute-1 nova_compute[226322]:         <secret type="ceph" uuid="1a70b85d-e3fd-5814-8a6a-37ea00fcae30"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       </auth>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <target dev="sda" bus="sata"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     </disk>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <interface type="ethernet">
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <mac address="fa:16:3e:97:1d:9d"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <model type="virtio"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <mtu size="1442"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <target dev="tape8f0e0cf-36"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     </interface>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <serial type="pty">
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <log file="/var/lib/nova/instances/b040151a-46d9-4685-84c4-316c2d7feedb/console.log" append="off"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     </serial>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <video>
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <model type="virtio"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     </video>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <input type="tablet" bus="usb"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <rng model="virtio">
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <backend model="random">/dev/urandom</backend>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     </rng>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <controller type="usb" index="0"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     <memballoon model="virtio">
Jan 26 10:16:33 compute-1 nova_compute[226322]:       <stats period="10"/>
Jan 26 10:16:33 compute-1 nova_compute[226322]:     </memballoon>
Jan 26 10:16:33 compute-1 nova_compute[226322]:   </devices>
Jan 26 10:16:33 compute-1 nova_compute[226322]: </domain>
Jan 26 10:16:33 compute-1 nova_compute[226322]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.861 226326 DEBUG nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Preparing to wait for external event network-vif-plugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.861 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.861 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.861 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.862 226326 DEBUG nova.virt.libvirt.vif [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T10:16:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1174728540',display_name='tempest-TestNetworkBasicOps-server-1174728540',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1174728540',id=12,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB+EGw3c7uXEi4OTNwulYSgVH+l4n13lnOEqRefy0kW9/M5qFO/QglqfVEv83QxC3p6/WVWNJKRmKF5HdJHv8FbQly+oOPz0zT0wKKx+uweCnuuDtsTED/V9sqhB977UJw==',key_name='tempest-TestNetworkBasicOps-1514805345',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-hl8103a0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T10:16:28Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=b040151a-46d9-4685-84c4-316c2d7feedb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.862 226326 DEBUG nova.network.os_vif_util [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.863 226326 DEBUG nova.network.os_vif_util [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:1d:9d,bridge_name='br-int',has_traffic_filtering=True,id=e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae,network=Network(9bff64e0-694f-4b2d-b4b5-5e3b1d94460e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape8f0e0cf-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.863 226326 DEBUG os_vif [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:1d:9d,bridge_name='br-int',has_traffic_filtering=True,id=e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae,network=Network(9bff64e0-694f-4b2d-b4b5-5e3b1d94460e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape8f0e0cf-36') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.864 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.864 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.865 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.867 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.868 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape8f0e0cf-36, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.869 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape8f0e0cf-36, col_values=(('external_ids', {'iface-id': 'e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:97:1d:9d', 'vm-uuid': 'b040151a-46d9-4685-84c4-316c2d7feedb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.870 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:33 compute-1 NetworkManager[49073]: <info>  [1769422593.8714] manager: (tape8f0e0cf-36): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.873 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.877 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.878 226326 INFO os_vif [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:1d:9d,bridge_name='br-int',has_traffic_filtering=True,id=e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae,network=Network(9bff64e0-694f-4b2d-b4b5-5e3b1d94460e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape8f0e0cf-36')
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.920 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.920 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.920 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No VIF found with MAC fa:16:3e:97:1d:9d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.921 226326 INFO nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Using config drive
Jan 26 10:16:33 compute-1 nova_compute[226322]: 2026-01-26 10:16:33.944 226326 DEBUG nova.storage.rbd_utils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image b040151a-46d9-4685-84c4-316c2d7feedb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:16:34 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1830823020' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:16:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:16:34 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3710592775' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:16:34 compute-1 ceph-mon[80107]: pgmap v1031: 353 pgs: 353 active+clean; 155 MiB data, 316 MiB used, 60 GiB / 60 GiB avail; 251 KiB/s rd, 3.8 MiB/s wr, 71 op/s
Jan 26 10:16:34 compute-1 nova_compute[226322]: 2026-01-26 10:16:34.229 226326 DEBUG nova.network.neutron [req-21652baa-e1ad-4586-8489-7f8a7af32e45 req-439277ee-8eb6-4654-8f70-9ea8cf0a7a6c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updated VIF entry in instance network info cache for port e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 10:16:34 compute-1 nova_compute[226322]: 2026-01-26 10:16:34.230 226326 DEBUG nova.network.neutron [req-21652baa-e1ad-4586-8489-7f8a7af32e45 req-439277ee-8eb6-4654-8f70-9ea8cf0a7a6c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updating instance_info_cache with network_info: [{"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:16:34 compute-1 nova_compute[226322]: 2026-01-26 10:16:34.252 226326 DEBUG oslo_concurrency.lockutils [req-21652baa-e1ad-4586-8489-7f8a7af32e45 req-439277ee-8eb6-4654-8f70-9ea8cf0a7a6c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:16:34 compute-1 nova_compute[226322]: 2026-01-26 10:16:34.337 226326 INFO nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Creating config drive at /var/lib/nova/instances/b040151a-46d9-4685-84c4-316c2d7feedb/disk.config
Jan 26 10:16:34 compute-1 nova_compute[226322]: 2026-01-26 10:16:34.342 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b040151a-46d9-4685-84c4-316c2d7feedb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_mg2qu9r execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:16:34 compute-1 nova_compute[226322]: 2026-01-26 10:16:34.467 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b040151a-46d9-4685-84c4-316c2d7feedb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_mg2qu9r" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:16:34 compute-1 nova_compute[226322]: 2026-01-26 10:16:34.497 226326 DEBUG nova.storage.rbd_utils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image b040151a-46d9-4685-84c4-316c2d7feedb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 10:16:34 compute-1 nova_compute[226322]: 2026-01-26 10:16:34.500 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b040151a-46d9-4685-84c4-316c2d7feedb/disk.config b040151a-46d9-4685-84c4-316c2d7feedb_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:16:34 compute-1 nova_compute[226322]: 2026-01-26 10:16:34.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:16:34 compute-1 nova_compute[226322]: 2026-01-26 10:16:34.689 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 26 10:16:34 compute-1 nova_compute[226322]: 2026-01-26 10:16:34.725 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 26 10:16:34 compute-1 nova_compute[226322]: 2026-01-26 10:16:34.777 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b040151a-46d9-4685-84c4-316c2d7feedb/disk.config b040151a-46d9-4685-84c4-316c2d7feedb_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.277s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:16:34 compute-1 nova_compute[226322]: 2026-01-26 10:16:34.778 226326 INFO nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Deleting local config drive /var/lib/nova/instances/b040151a-46d9-4685-84c4-316c2d7feedb/disk.config because it was imported into RBD.
Jan 26 10:16:34 compute-1 systemd[1]: Starting libvirt secret daemon...
Jan 26 10:16:34 compute-1 systemd[1]: Started libvirt secret daemon.
Jan 26 10:16:34 compute-1 kernel: tape8f0e0cf-36: entered promiscuous mode
Jan 26 10:16:34 compute-1 NetworkManager[49073]: <info>  [1769422594.8638] manager: (tape8f0e0cf-36): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Jan 26 10:16:34 compute-1 ovn_controller[133670]: 2026-01-26T10:16:34Z|00067|binding|INFO|Claiming lport e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae for this chassis.
Jan 26 10:16:34 compute-1 nova_compute[226322]: 2026-01-26 10:16:34.863 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:34 compute-1 ovn_controller[133670]: 2026-01-26T10:16:34Z|00068|binding|INFO|e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae: Claiming fa:16:3e:97:1d:9d 10.100.0.12
Jan 26 10:16:34 compute-1 nova_compute[226322]: 2026-01-26 10:16:34.868 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:34 compute-1 nova_compute[226322]: 2026-01-26 10:16:34.870 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:34 compute-1 NetworkManager[49073]: <info>  [1769422594.8740] manager: (patch-br-int-to-provnet-94d9950f-5cf2-4813-9455-dd14377245f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Jan 26 10:16:34 compute-1 nova_compute[226322]: 2026-01-26 10:16:34.872 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:34 compute-1 NetworkManager[49073]: <info>  [1769422594.8771] manager: (patch-provnet-94d9950f-5cf2-4813-9455-dd14377245f4-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Jan 26 10:16:34 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.888 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:1d:9d 10.100.0.12'], port_security=['fa:16:3e:97:1d:9d 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'b040151a-46d9-4685-84c4-316c2d7feedb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5cce9ca3-082e-4a27-8023-5db6a50012d5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db1eda71-392f-4d4b-8724-78530674037e, chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], logical_port=e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:16:34 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.889 143326 INFO neutron.agent.ovn.metadata.agent [-] Port e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae in datapath 9bff64e0-694f-4b2d-b4b5-5e3b1d94460e bound to our chassis
Jan 26 10:16:34 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.891 143326 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9bff64e0-694f-4b2d-b4b5-5e3b1d94460e
Jan 26 10:16:34 compute-1 systemd-machined[194876]: New machine qemu-5-instance-0000000c.
Jan 26 10:16:34 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.901 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[978aa8cd-3d1d-410a-898f-1e45e9f8448e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:16:34 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.902 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9bff64e0-61 in ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 10:16:34 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.904 229912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9bff64e0-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 10:16:34 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.904 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[af69d122-a6fa-457f-b88c-b7798049706e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:16:34 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.904 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[a5580ffd-bcee-433e-9fad-011901dddcc0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:16:34 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.915 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[baad70b1-26f6-415b-8aad-184e96b07707]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:16:34 compute-1 systemd[1]: Started Virtual Machine qemu-5-instance-0000000c.
Jan 26 10:16:34 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.939 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[e821238e-bcff-4d7a-af47-f75acf9d48a2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:16:34 compute-1 systemd-udevd[238068]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 10:16:34 compute-1 nova_compute[226322]: 2026-01-26 10:16:34.947 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:34 compute-1 NetworkManager[49073]: <info>  [1769422594.9531] device (tape8f0e0cf-36): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 10:16:34 compute-1 NetworkManager[49073]: <info>  [1769422594.9546] device (tape8f0e0cf-36): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 10:16:34 compute-1 nova_compute[226322]: 2026-01-26 10:16:34.956 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:34 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.965 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[9e9e68db-2735-4ba9-8b3b-2f88f20fdad6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:16:34 compute-1 ovn_controller[133670]: 2026-01-26T10:16:34Z|00069|binding|INFO|Setting lport e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae ovn-installed in OVS
Jan 26 10:16:34 compute-1 ovn_controller[133670]: 2026-01-26T10:16:34Z|00070|binding|INFO|Setting lport e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae up in Southbound
Jan 26 10:16:34 compute-1 nova_compute[226322]: 2026-01-26 10:16:34.968 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:34 compute-1 systemd-udevd[238073]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 10:16:34 compute-1 NetworkManager[49073]: <info>  [1769422594.9717] manager: (tap9bff64e0-60): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Jan 26 10:16:34 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.970 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[a7a78a08-b02f-4c9e-8333-2ab2e889f0ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:16:34 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.994 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[501988c9-7219-404b-9b44-ee73879df3ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:16:34 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.996 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[ae0da81f-271c-4b1b-990d-044237bdcd51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.001 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:35 compute-1 NetworkManager[49073]: <info>  [1769422595.0153] device (tap9bff64e0-60): carrier: link connected
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.019 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[cc7ea17c-f705-49a0-99d7-19fb7e193135]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.033 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[0e7103e3-c982-4a55-a830-1875a6bf0706]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9bff64e0-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:7f:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455558, 'reachable_time': 18566, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238098, 'error': None, 'target': 'ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.047 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[1288446b-80eb-4344-a9e7-e759dc3cc7d0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe82:7f67'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 455558, 'tstamp': 455558}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238099, 'error': None, 'target': 'ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.063 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[749b439e-1b97-4e2b-b96a-4e5657f7d643]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9bff64e0-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:7f:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455558, 'reachable_time': 18566, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238100, 'error': None, 'target': 'ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.087 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[1e1a46c1-2828-4c76-9d39-4647c47b244c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.137 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[1e1f8a40-e61c-422e-a3eb-97bc251f1344]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.138 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bff64e0-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.138 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.139 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9bff64e0-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.175 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:35 compute-1 NetworkManager[49073]: <info>  [1769422595.1760] manager: (tap9bff64e0-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Jan 26 10:16:35 compute-1 kernel: tap9bff64e0-60: entered promiscuous mode
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.178 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9bff64e0-60, col_values=(('external_ids', {'iface-id': '58fa2dc8-9a67-4ebd-8c74-a3dee5be3d64'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.179 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:35 compute-1 ovn_controller[133670]: 2026-01-26T10:16:35Z|00071|binding|INFO|Releasing lport 58fa2dc8-9a67-4ebd-8c74-a3dee5be3d64 from this chassis (sb_readonly=0)
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.193 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.194 143326 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9bff64e0-694f-4b2d-b4b5-5e3b1d94460e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9bff64e0-694f-4b2d-b4b5-5e3b1d94460e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.194 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[e1896926-c363-455d-8e96-e3b281e2aeba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.195 143326 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: global
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     log         /dev/log local0 debug
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     log-tag     haproxy-metadata-proxy-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     user        root
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     group       root
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     maxconn     1024
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     pidfile     /var/lib/neutron/external/pids/9bff64e0-694f-4b2d-b4b5-5e3b1d94460e.pid.haproxy
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     daemon
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: 
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: defaults
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     log global
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     mode http
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     option httplog
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     option dontlognull
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     option http-server-close
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     option forwardfor
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     retries                 3
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     timeout http-request    30s
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     timeout connect         30s
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     timeout client          32s
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     timeout server          32s
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     timeout http-keep-alive 30s
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: 
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: 
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: listen listener
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     bind 169.254.169.254:80
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:     http-request add-header X-OVN-Network-ID 9bff64e0-694f-4b2d-b4b5-5e3b1d94460e
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.196 143326 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e', 'env', 'PROCESS_TAG=haproxy-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9bff64e0-694f-4b2d-b4b5-5e3b1d94460e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.437 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422595.4367375, b040151a-46d9-4685-84c4-316c2d7feedb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.438 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] VM Started (Lifecycle Event)
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.465 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.471 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422595.4368613, b040151a-46d9-4685-84c4-316c2d7feedb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.471 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] VM Paused (Lifecycle Event)
Jan 26 10:16:35 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.485 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.537 226326 DEBUG nova.compute.manager [req-7a5e3ded-2640-460d-9dff-843b6d3062c9 req-fbd04b0f-84ca-4f86-b88a-fa49da3fb062 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received event network-vif-plugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.538 226326 DEBUG oslo_concurrency.lockutils [req-7a5e3ded-2640-460d-9dff-843b6d3062c9 req-fbd04b0f-84ca-4f86-b88a-fa49da3fb062 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.538 226326 DEBUG oslo_concurrency.lockutils [req-7a5e3ded-2640-460d-9dff-843b6d3062c9 req-fbd04b0f-84ca-4f86-b88a-fa49da3fb062 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.538 226326 DEBUG oslo_concurrency.lockutils [req-7a5e3ded-2640-460d-9dff-843b6d3062c9 req-fbd04b0f-84ca-4f86-b88a-fa49da3fb062 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.539 226326 DEBUG nova.compute.manager [req-7a5e3ded-2640-460d-9dff-843b6d3062c9 req-fbd04b0f-84ca-4f86-b88a-fa49da3fb062 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Processing event network-vif-plugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.539 226326 DEBUG nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.543 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.545 226326 INFO nova.virt.libvirt.driver [-] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Instance spawned successfully.
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.545 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 10:16:35 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3220884915' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:16:35 compute-1 podman[238174]: 2026-01-26 10:16:35.551525709 +0000 UTC m=+0.049210247 container create ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.557 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.564 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422595.5424285, b040151a-46d9-4685-84c4-316c2d7feedb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.564 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] VM Resumed (Lifecycle Event)
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.587 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.588 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.589 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.589 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.590 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.591 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.595 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.598 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 10:16:35 compute-1 systemd[1]: Started libpod-conmon-ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a.scope.
Jan 26 10:16:35 compute-1 podman[238174]: 2026-01-26 10:16:35.525733864 +0000 UTC m=+0.023418442 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 26 10:16:35 compute-1 systemd[1]: Started libcrun container.
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.626 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 10:16:35 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78225d3fda0ba19bc05bebf3fc0755524da6232cfdd0544b56f1d421e656d4c8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 10:16:35 compute-1 podman[238174]: 2026-01-26 10:16:35.703884524 +0000 UTC m=+0.201569142 container init ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 26 10:16:35 compute-1 podman[238174]: 2026-01-26 10:16:35.710749203 +0000 UTC m=+0.208433751 container start ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 26 10:16:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:35.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.724 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:16:35 compute-1 neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e[238190]: [NOTICE]   (238194) : New worker (238196) forked
Jan 26 10:16:35 compute-1 neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e[238190]: [NOTICE]   (238194) : Loading success.
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.732 226326 INFO nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Took 6.96 seconds to spawn the instance on the hypervisor.
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.733 226326 DEBUG nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:16:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:35.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.842 226326 INFO nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Took 7.92 seconds to build instance.
Jan 26 10:16:35 compute-1 nova_compute[226322]: 2026-01-26 10:16:35.873 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:16:36 compute-1 nova_compute[226322]: 2026-01-26 10:16:36.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:16:36 compute-1 nova_compute[226322]: 2026-01-26 10:16:36.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:16:36 compute-1 nova_compute[226322]: 2026-01-26 10:16:36.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:16:36 compute-1 nova_compute[226322]: 2026-01-26 10:16:36.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:16:36 compute-1 ceph-mon[80107]: pgmap v1032: 353 pgs: 353 active+clean; 167 MiB data, 331 MiB used, 60 GiB / 60 GiB avail; 346 KiB/s rd, 3.9 MiB/s wr, 93 op/s
Jan 26 10:16:36 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/613783079' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:16:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:16:37 compute-1 nova_compute[226322]: 2026-01-26 10:16:37.212 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:16:37 compute-1 nova_compute[226322]: 2026-01-26 10:16:37.213 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquired lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:16:37 compute-1 nova_compute[226322]: 2026-01-26 10:16:37.213 226326 DEBUG nova.network.neutron [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 10:16:37 compute-1 nova_compute[226322]: 2026-01-26 10:16:37.213 226326 DEBUG nova.objects.instance [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b040151a-46d9-4685-84c4-316c2d7feedb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 10:16:37 compute-1 nova_compute[226322]: 2026-01-26 10:16:37.665 226326 DEBUG nova.compute.manager [req-2661df50-3103-4f8a-a995-04e222e11c3a req-565d1cb5-ba26-4ad4-9ead-33933eb521a0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received event network-vif-plugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:16:37 compute-1 nova_compute[226322]: 2026-01-26 10:16:37.666 226326 DEBUG oslo_concurrency.lockutils [req-2661df50-3103-4f8a-a995-04e222e11c3a req-565d1cb5-ba26-4ad4-9ead-33933eb521a0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:16:37 compute-1 nova_compute[226322]: 2026-01-26 10:16:37.666 226326 DEBUG oslo_concurrency.lockutils [req-2661df50-3103-4f8a-a995-04e222e11c3a req-565d1cb5-ba26-4ad4-9ead-33933eb521a0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:16:37 compute-1 nova_compute[226322]: 2026-01-26 10:16:37.666 226326 DEBUG oslo_concurrency.lockutils [req-2661df50-3103-4f8a-a995-04e222e11c3a req-565d1cb5-ba26-4ad4-9ead-33933eb521a0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:16:37 compute-1 nova_compute[226322]: 2026-01-26 10:16:37.667 226326 DEBUG nova.compute.manager [req-2661df50-3103-4f8a-a995-04e222e11c3a req-565d1cb5-ba26-4ad4-9ead-33933eb521a0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] No waiting events found dispatching network-vif-plugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 10:16:37 compute-1 nova_compute[226322]: 2026-01-26 10:16:37.667 226326 WARNING nova.compute.manager [req-2661df50-3103-4f8a-a995-04e222e11c3a req-565d1cb5-ba26-4ad4-9ead-33933eb521a0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received unexpected event network-vif-plugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae for instance with vm_state active and task_state None.
Jan 26 10:16:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:37.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:37 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1154178246' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:16:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:16:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:37.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:16:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:16:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:16:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:16:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:16:38 compute-1 ceph-mon[80107]: pgmap v1033: 353 pgs: 353 active+clean; 167 MiB data, 331 MiB used, 60 GiB / 60 GiB avail; 346 KiB/s rd, 3.9 MiB/s wr, 92 op/s
Jan 26 10:16:38 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1410766536' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:16:38 compute-1 nova_compute[226322]: 2026-01-26 10:16:38.872 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:39 compute-1 nova_compute[226322]: 2026-01-26 10:16:39.544 226326 DEBUG nova.network.neutron [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updating instance_info_cache with network_info: [{"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:16:39 compute-1 nova_compute[226322]: 2026-01-26 10:16:39.565 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Releasing lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:16:39 compute-1 nova_compute[226322]: 2026-01-26 10:16:39.565 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 10:16:39 compute-1 nova_compute[226322]: 2026-01-26 10:16:39.565 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:16:39 compute-1 nova_compute[226322]: 2026-01-26 10:16:39.566 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:16:39 compute-1 nova_compute[226322]: 2026-01-26 10:16:39.566 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:16:39 compute-1 nova_compute[226322]: 2026-01-26 10:16:39.566 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:16:39 compute-1 nova_compute[226322]: 2026-01-26 10:16:39.566 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:16:39 compute-1 nova_compute[226322]: 2026-01-26 10:16:39.567 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:16:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:39.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:39.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:40 compute-1 nova_compute[226322]: 2026-01-26 10:16:40.017 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:41 compute-1 nova_compute[226322]: 2026-01-26 10:16:41.695 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:16:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:16:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:41.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:16:41 compute-1 ceph-mon[80107]: pgmap v1034: 353 pgs: 353 active+clean; 167 MiB data, 332 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 3.9 MiB/s wr, 167 op/s
Jan 26 10:16:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:16:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:41.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:16:41 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:16:42 compute-1 nova_compute[226322]: 2026-01-26 10:16:42.006 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:16:42 compute-1 nova_compute[226322]: 2026-01-26 10:16:42.007 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:16:42 compute-1 nova_compute[226322]: 2026-01-26 10:16:42.007 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:16:42 compute-1 nova_compute[226322]: 2026-01-26 10:16:42.007 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:16:42 compute-1 nova_compute[226322]: 2026-01-26 10:16:42.008 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:16:42 compute-1 nova_compute[226322]: 2026-01-26 10:16:42.354 226326 DEBUG nova.compute.manager [req-119c5248-4dfb-4f64-b664-5fca0fb5b06c req-1d2e32c8-3983-49a0-bb03-00e7b3cbd844 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received event network-changed-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:16:42 compute-1 nova_compute[226322]: 2026-01-26 10:16:42.355 226326 DEBUG nova.compute.manager [req-119c5248-4dfb-4f64-b664-5fca0fb5b06c req-1d2e32c8-3983-49a0-bb03-00e7b3cbd844 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Refreshing instance network info cache due to event network-changed-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 10:16:42 compute-1 nova_compute[226322]: 2026-01-26 10:16:42.355 226326 DEBUG oslo_concurrency.lockutils [req-119c5248-4dfb-4f64-b664-5fca0fb5b06c req-1d2e32c8-3983-49a0-bb03-00e7b3cbd844 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:16:42 compute-1 nova_compute[226322]: 2026-01-26 10:16:42.355 226326 DEBUG oslo_concurrency.lockutils [req-119c5248-4dfb-4f64-b664-5fca0fb5b06c req-1d2e32c8-3983-49a0-bb03-00e7b3cbd844 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:16:42 compute-1 nova_compute[226322]: 2026-01-26 10:16:42.356 226326 DEBUG nova.network.neutron [req-119c5248-4dfb-4f64-b664-5fca0fb5b06c req-1d2e32c8-3983-49a0-bb03-00e7b3cbd844 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Refreshing network info cache for port e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 10:16:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:16:42 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3699931200' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:16:42 compute-1 nova_compute[226322]: 2026-01-26 10:16:42.480 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:16:42 compute-1 ceph-mon[80107]: pgmap v1035: 353 pgs: 353 active+clean; 167 MiB data, 332 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 120 KiB/s wr, 95 op/s
Jan 26 10:16:42 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3699931200' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:16:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:16:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:16:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:16:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:16:43 compute-1 nova_compute[226322]: 2026-01-26 10:16:43.032 226326 DEBUG nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] skipping disk for instance-0000000c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 26 10:16:43 compute-1 nova_compute[226322]: 2026-01-26 10:16:43.033 226326 DEBUG nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] skipping disk for instance-0000000c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 26 10:16:43 compute-1 nova_compute[226322]: 2026-01-26 10:16:43.236 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:16:43 compute-1 nova_compute[226322]: 2026-01-26 10:16:43.237 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4670MB free_disk=59.92185592651367GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:16:43 compute-1 nova_compute[226322]: 2026-01-26 10:16:43.237 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:16:43 compute-1 nova_compute[226322]: 2026-01-26 10:16:43.238 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:16:43 compute-1 podman[238231]: 2026-01-26 10:16:43.406163011 +0000 UTC m=+0.161907938 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 10:16:43 compute-1 nova_compute[226322]: 2026-01-26 10:16:43.440 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Instance b040151a-46d9-4685-84c4-316c2d7feedb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 10:16:43 compute-1 nova_compute[226322]: 2026-01-26 10:16:43.442 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:16:43 compute-1 nova_compute[226322]: 2026-01-26 10:16:43.442 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:16:43 compute-1 nova_compute[226322]: 2026-01-26 10:16:43.493 226326 DEBUG nova.network.neutron [req-119c5248-4dfb-4f64-b664-5fca0fb5b06c req-1d2e32c8-3983-49a0-bb03-00e7b3cbd844 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updated VIF entry in instance network info cache for port e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 10:16:43 compute-1 nova_compute[226322]: 2026-01-26 10:16:43.494 226326 DEBUG nova.network.neutron [req-119c5248-4dfb-4f64-b664-5fca0fb5b06c req-1d2e32c8-3983-49a0-bb03-00e7b3cbd844 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updating instance_info_cache with network_info: [{"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:16:43 compute-1 nova_compute[226322]: 2026-01-26 10:16:43.570 226326 DEBUG oslo_concurrency.lockutils [req-119c5248-4dfb-4f64-b664-5fca0fb5b06c req-1d2e32c8-3983-49a0-bb03-00e7b3cbd844 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:16:43 compute-1 nova_compute[226322]: 2026-01-26 10:16:43.610 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:16:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:43.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:43.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:43 compute-1 nova_compute[226322]: 2026-01-26 10:16:43.874 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:44 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:16:44 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3339811741' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:16:44 compute-1 nova_compute[226322]: 2026-01-26 10:16:44.116 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:16:44 compute-1 nova_compute[226322]: 2026-01-26 10:16:44.120 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:16:44 compute-1 nova_compute[226322]: 2026-01-26 10:16:44.221 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:16:44 compute-1 nova_compute[226322]: 2026-01-26 10:16:44.264 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:16:44 compute-1 nova_compute[226322]: 2026-01-26 10:16:44.265 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:16:45 compute-1 nova_compute[226322]: 2026-01-26 10:16:45.017 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:45 compute-1 ceph-mon[80107]: pgmap v1036: 353 pgs: 353 active+clean; 167 MiB data, 332 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 120 KiB/s wr, 95 op/s
Jan 26 10:16:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3339811741' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:16:45 compute-1 sudo[238279]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:16:45 compute-1 sudo[238279]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:16:45 compute-1 sudo[238279]: pam_unix(sudo:session): session closed for user root
Jan 26 10:16:45 compute-1 sudo[238306]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:16:45 compute-1 sudo[238304]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:16:45 compute-1 sudo[238306]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:16:45 compute-1 sudo[238304]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:16:45 compute-1 sudo[238304]: pam_unix(sudo:session): session closed for user root
Jan 26 10:16:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000054s ======
Jan 26 10:16:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:45.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 26 10:16:45 compute-1 sudo[238306]: pam_unix(sudo:session): session closed for user root
Jan 26 10:16:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:45.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:46 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:16:46 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:16:46 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:16:46 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:16:47 compute-1 ceph-mon[80107]: pgmap v1037: 353 pgs: 353 active+clean; 167 MiB data, 332 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 120 KiB/s wr, 96 op/s
Jan 26 10:16:47 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:16:47 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:16:47 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:16:47 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:16:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:47.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:47.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:16:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:16:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:16:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:16:48 compute-1 ceph-mon[80107]: pgmap v1038: 353 pgs: 353 active+clean; 167 MiB data, 332 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 25 KiB/s wr, 74 op/s
Jan 26 10:16:48 compute-1 ovn_controller[133670]: 2026-01-26T10:16:48Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:97:1d:9d 10.100.0.12
Jan 26 10:16:48 compute-1 ovn_controller[133670]: 2026-01-26T10:16:48Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:97:1d:9d 10.100.0.12
Jan 26 10:16:48 compute-1 nova_compute[226322]: 2026-01-26 10:16:48.877 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:16:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:49.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:16:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:49.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:16:50 compute-1 nova_compute[226322]: 2026-01-26 10:16:50.018 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:50 compute-1 ceph-mon[80107]: pgmap v1039: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.2 MiB/s wr, 139 op/s
Jan 26 10:16:50 compute-1 sudo[238388]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:16:50 compute-1 sudo[238388]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:16:50 compute-1 sudo[238388]: pam_unix(sudo:session): session closed for user root
Jan 26 10:16:51 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:16:51 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:16:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:51.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:16:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:51.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:16:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:16:52 compute-1 ceph-mon[80107]: pgmap v1040: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 329 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Jan 26 10:16:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:16:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:16:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:16:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:16:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:16:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:53.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:16:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:53.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:53 compute-1 nova_compute[226322]: 2026-01-26 10:16:53.888 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:53.940 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:16:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:53.941 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:16:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:16:53.941 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:16:54 compute-1 ceph-mon[80107]: pgmap v1041: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 329 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Jan 26 10:16:55 compute-1 nova_compute[226322]: 2026-01-26 10:16:55.066 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:55.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:16:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:55.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:16:56 compute-1 ceph-mon[80107]: pgmap v1042: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 329 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 26 10:16:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:16:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:57.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:57.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:16:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:16:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:16:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:16:58 compute-1 nova_compute[226322]: 2026-01-26 10:16:58.891 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:16:59 compute-1 ceph-mon[80107]: pgmap v1043: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 328 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 26 10:16:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/653039017' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:16:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/653039017' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:16:59 compute-1 podman[238417]: 2026-01-26 10:16:59.337895497 +0000 UTC m=+0.106639256 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 26 10:16:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:16:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:59.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:16:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:16:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:16:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:59.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:17:00 compute-1 nova_compute[226322]: 2026-01-26 10:17:00.067 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:00 compute-1 ceph-mon[80107]: pgmap v1044: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 335 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Jan 26 10:17:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:01.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:17:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:01.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:17:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:17:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:17:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:17:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:17:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:17:03 compute-1 ceph-mon[80107]: pgmap v1045: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 7.7 KiB/s rd, 12 KiB/s wr, 2 op/s
Jan 26 10:17:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:03.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:03.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:03 compute-1 nova_compute[226322]: 2026-01-26 10:17:03.893 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:17:04 compute-1 nova_compute[226322]: 2026-01-26 10:17:04.800 226326 DEBUG oslo_concurrency.lockutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "b040151a-46d9-4685-84c4-316c2d7feedb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:17:04 compute-1 nova_compute[226322]: 2026-01-26 10:17:04.801 226326 DEBUG oslo_concurrency.lockutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:17:04 compute-1 nova_compute[226322]: 2026-01-26 10:17:04.802 226326 DEBUG oslo_concurrency.lockutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:17:04 compute-1 nova_compute[226322]: 2026-01-26 10:17:04.802 226326 DEBUG oslo_concurrency.lockutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:17:04 compute-1 nova_compute[226322]: 2026-01-26 10:17:04.802 226326 DEBUG oslo_concurrency.lockutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:17:04 compute-1 nova_compute[226322]: 2026-01-26 10:17:04.803 226326 INFO nova.compute.manager [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Terminating instance
Jan 26 10:17:04 compute-1 nova_compute[226322]: 2026-01-26 10:17:04.804 226326 DEBUG nova.compute.manager [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 10:17:04 compute-1 kernel: tape8f0e0cf-36 (unregistering): left promiscuous mode
Jan 26 10:17:04 compute-1 NetworkManager[49073]: <info>  [1769422624.8559] device (tape8f0e0cf-36): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 10:17:04 compute-1 ovn_controller[133670]: 2026-01-26T10:17:04Z|00072|binding|INFO|Releasing lport e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae from this chassis (sb_readonly=0)
Jan 26 10:17:04 compute-1 ovn_controller[133670]: 2026-01-26T10:17:04Z|00073|binding|INFO|Setting lport e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae down in Southbound
Jan 26 10:17:04 compute-1 nova_compute[226322]: 2026-01-26 10:17:04.902 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:04 compute-1 ovn_controller[133670]: 2026-01-26T10:17:04Z|00074|binding|INFO|Removing iface tape8f0e0cf-36 ovn-installed in OVS
Jan 26 10:17:04 compute-1 nova_compute[226322]: 2026-01-26 10:17:04.904 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:04 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:17:04.911 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:1d:9d 10.100.0.12'], port_security=['fa:16:3e:97:1d:9d 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'b040151a-46d9-4685-84c4-316c2d7feedb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5cce9ca3-082e-4a27-8023-5db6a50012d5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db1eda71-392f-4d4b-8724-78530674037e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], logical_port=e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:17:04 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:17:04.912 143326 INFO neutron.agent.ovn.metadata.agent [-] Port e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae in datapath 9bff64e0-694f-4b2d-b4b5-5e3b1d94460e unbound from our chassis
Jan 26 10:17:04 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:17:04.913 143326 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9bff64e0-694f-4b2d-b4b5-5e3b1d94460e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 10:17:04 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:17:04.914 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[d4e40cd8-d02c-4ac5-b4a6-5eb34bfd1bfb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:17:04 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:17:04.915 143326 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e namespace which is not needed anymore
Jan 26 10:17:04 compute-1 nova_compute[226322]: 2026-01-26 10:17:04.920 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:04 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 26 10:17:04 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000c.scope: Consumed 14.093s CPU time.
Jan 26 10:17:04 compute-1 systemd-machined[194876]: Machine qemu-5-instance-0000000c terminated.
Jan 26 10:17:05 compute-1 nova_compute[226322]: 2026-01-26 10:17:05.039 226326 INFO nova.virt.libvirt.driver [-] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Instance destroyed successfully.
Jan 26 10:17:05 compute-1 nova_compute[226322]: 2026-01-26 10:17:05.040 226326 DEBUG nova.objects.instance [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'resources' on Instance uuid b040151a-46d9-4685-84c4-316c2d7feedb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 10:17:05 compute-1 neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e[238190]: [NOTICE]   (238194) : haproxy version is 2.8.14-c23fe91
Jan 26 10:17:05 compute-1 neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e[238190]: [NOTICE]   (238194) : path to executable is /usr/sbin/haproxy
Jan 26 10:17:05 compute-1 neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e[238190]: [WARNING]  (238194) : Exiting Master process...
Jan 26 10:17:05 compute-1 neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e[238190]: [ALERT]    (238194) : Current worker (238196) exited with code 143 (Terminated)
Jan 26 10:17:05 compute-1 neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e[238190]: [WARNING]  (238194) : All workers exited. Exiting... (0)
Jan 26 10:17:05 compute-1 systemd[1]: libpod-ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a.scope: Deactivated successfully.
Jan 26 10:17:05 compute-1 podman[238464]: 2026-01-26 10:17:05.054030205 +0000 UTC m=+0.052063434 container died ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 10:17:05 compute-1 nova_compute[226322]: 2026-01-26 10:17:05.069 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:05 compute-1 ceph-mon[80107]: pgmap v1046: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 7.7 KiB/s rd, 12 KiB/s wr, 2 op/s
Jan 26 10:17:05 compute-1 nova_compute[226322]: 2026-01-26 10:17:05.305 226326 DEBUG nova.virt.libvirt.vif [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T10:16:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1174728540',display_name='tempest-TestNetworkBasicOps-server-1174728540',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1174728540',id=12,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB+EGw3c7uXEi4OTNwulYSgVH+l4n13lnOEqRefy0kW9/M5qFO/QglqfVEv83QxC3p6/WVWNJKRmKF5HdJHv8FbQly+oOPz0zT0wKKx+uweCnuuDtsTED/V9sqhB977UJw==',key_name='tempest-TestNetworkBasicOps-1514805345',keypairs=<?>,launch_index=0,launched_at=2026-01-26T10:16:35Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-hl8103a0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T10:16:35Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=b040151a-46d9-4685-84c4-316c2d7feedb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 10:17:05 compute-1 nova_compute[226322]: 2026-01-26 10:17:05.306 226326 DEBUG nova.network.os_vif_util [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 10:17:05 compute-1 nova_compute[226322]: 2026-01-26 10:17:05.306 226326 DEBUG nova.network.os_vif_util [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:97:1d:9d,bridge_name='br-int',has_traffic_filtering=True,id=e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae,network=Network(9bff64e0-694f-4b2d-b4b5-5e3b1d94460e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape8f0e0cf-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 10:17:05 compute-1 nova_compute[226322]: 2026-01-26 10:17:05.307 226326 DEBUG os_vif [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:97:1d:9d,bridge_name='br-int',has_traffic_filtering=True,id=e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae,network=Network(9bff64e0-694f-4b2d-b4b5-5e3b1d94460e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape8f0e0cf-36') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 10:17:05 compute-1 nova_compute[226322]: 2026-01-26 10:17:05.308 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:05 compute-1 nova_compute[226322]: 2026-01-26 10:17:05.308 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape8f0e0cf-36, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:17:05 compute-1 nova_compute[226322]: 2026-01-26 10:17:05.310 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:05 compute-1 nova_compute[226322]: 2026-01-26 10:17:05.311 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:05 compute-1 nova_compute[226322]: 2026-01-26 10:17:05.313 226326 INFO os_vif [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:97:1d:9d,bridge_name='br-int',has_traffic_filtering=True,id=e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae,network=Network(9bff64e0-694f-4b2d-b4b5-5e3b1d94460e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape8f0e0cf-36')
Jan 26 10:17:05 compute-1 sudo[238503]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:17:05 compute-1 sudo[238503]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:17:05 compute-1 sudo[238503]: pam_unix(sudo:session): session closed for user root
Jan 26 10:17:05 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a-userdata-shm.mount: Deactivated successfully.
Jan 26 10:17:05 compute-1 systemd[1]: var-lib-containers-storage-overlay-78225d3fda0ba19bc05bebf3fc0755524da6232cfdd0544b56f1d421e656d4c8-merged.mount: Deactivated successfully.
Jan 26 10:17:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:05.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:05 compute-1 nova_compute[226322]: 2026-01-26 10:17:05.826 226326 DEBUG nova.compute.manager [req-95e36ca9-58bb-4cdc-8282-20d365e5cabd req-ed5aef2f-5856-492c-b86c-f85d004b5e70 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received event network-vif-unplugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:17:05 compute-1 nova_compute[226322]: 2026-01-26 10:17:05.827 226326 DEBUG oslo_concurrency.lockutils [req-95e36ca9-58bb-4cdc-8282-20d365e5cabd req-ed5aef2f-5856-492c-b86c-f85d004b5e70 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:17:05 compute-1 nova_compute[226322]: 2026-01-26 10:17:05.827 226326 DEBUG oslo_concurrency.lockutils [req-95e36ca9-58bb-4cdc-8282-20d365e5cabd req-ed5aef2f-5856-492c-b86c-f85d004b5e70 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:17:05 compute-1 nova_compute[226322]: 2026-01-26 10:17:05.827 226326 DEBUG oslo_concurrency.lockutils [req-95e36ca9-58bb-4cdc-8282-20d365e5cabd req-ed5aef2f-5856-492c-b86c-f85d004b5e70 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:17:05 compute-1 nova_compute[226322]: 2026-01-26 10:17:05.827 226326 DEBUG nova.compute.manager [req-95e36ca9-58bb-4cdc-8282-20d365e5cabd req-ed5aef2f-5856-492c-b86c-f85d004b5e70 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] No waiting events found dispatching network-vif-unplugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 10:17:05 compute-1 nova_compute[226322]: 2026-01-26 10:17:05.828 226326 DEBUG nova.compute.manager [req-95e36ca9-58bb-4cdc-8282-20d365e5cabd req-ed5aef2f-5856-492c-b86c-f85d004b5e70 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received event network-vif-unplugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 10:17:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:05.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:05 compute-1 sshd-session[238441]: Invalid user centos from 104.248.198.71 port 49936
Jan 26 10:17:06 compute-1 nova_compute[226322]: 2026-01-26 10:17:06.026 226326 DEBUG nova.compute.manager [req-9332e68d-7895-479b-937d-a13200fb5d82 req-723294d8-0094-4732-9a05-ba6b5572064c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received event network-changed-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:17:06 compute-1 nova_compute[226322]: 2026-01-26 10:17:06.026 226326 DEBUG nova.compute.manager [req-9332e68d-7895-479b-937d-a13200fb5d82 req-723294d8-0094-4732-9a05-ba6b5572064c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Refreshing instance network info cache due to event network-changed-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 10:17:06 compute-1 nova_compute[226322]: 2026-01-26 10:17:06.027 226326 DEBUG oslo_concurrency.lockutils [req-9332e68d-7895-479b-937d-a13200fb5d82 req-723294d8-0094-4732-9a05-ba6b5572064c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 10:17:06 compute-1 nova_compute[226322]: 2026-01-26 10:17:06.027 226326 DEBUG oslo_concurrency.lockutils [req-9332e68d-7895-479b-937d-a13200fb5d82 req-723294d8-0094-4732-9a05-ba6b5572064c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 10:17:06 compute-1 nova_compute[226322]: 2026-01-26 10:17:06.027 226326 DEBUG nova.network.neutron [req-9332e68d-7895-479b-937d-a13200fb5d82 req-723294d8-0094-4732-9a05-ba6b5572064c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Refreshing network info cache for port e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 10:17:06 compute-1 podman[238464]: 2026-01-26 10:17:06.031232945 +0000 UTC m=+1.029266174 container cleanup ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 10:17:06 compute-1 systemd[1]: libpod-conmon-ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a.scope: Deactivated successfully.
Jan 26 10:17:06 compute-1 ceph-mon[80107]: pgmap v1047: 353 pgs: 353 active+clean; 200 MiB data, 365 MiB used, 60 GiB / 60 GiB avail; 8.2 KiB/s rd, 19 KiB/s wr, 3 op/s
Jan 26 10:17:06 compute-1 podman[238551]: 2026-01-26 10:17:06.124607349 +0000 UTC m=+0.065644216 container remove ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 26 10:17:06 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:17:06.136 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[284337f3-ec71-428e-ab97-6b66626eb86c]: (4, ('Mon Jan 26 10:17:04 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e (ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a)\nec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a\nMon Jan 26 10:17:06 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e (ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a)\nec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:17:06 compute-1 sshd-session[238441]: Connection closed by invalid user centos 104.248.198.71 port 49936 [preauth]
Jan 26 10:17:06 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:17:06.139 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[d8b9518e-5438-42d9-a4a4-4209932f3089]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:17:06 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:17:06.141 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bff64e0-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:17:06 compute-1 nova_compute[226322]: 2026-01-26 10:17:06.144 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:06 compute-1 kernel: tap9bff64e0-60: left promiscuous mode
Jan 26 10:17:06 compute-1 nova_compute[226322]: 2026-01-26 10:17:06.147 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:06 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:17:06.150 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[bf9aa140-895a-40f2-af9a-18c9047da115]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:17:06 compute-1 nova_compute[226322]: 2026-01-26 10:17:06.162 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:06 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:17:06.164 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[a8816bea-5ea9-4304-8c60-686badd6ce65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:17:06 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:17:06.165 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[3d4f472e-3efd-4fad-afea-badcbd26a7e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:17:06 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:17:06.192 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[c1ddd192-cab8-4916-9476-c4657d0105b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455552, 'reachable_time': 30589, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238567, 'error': None, 'target': 'ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:17:06 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:17:06.196 143615 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 10:17:06 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:17:06.196 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[b4d3ed7b-e77f-45b2-83ca-5e000a1e77e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 10:17:06 compute-1 systemd[1]: run-netns-ovnmeta\x2d9bff64e0\x2d694f\x2d4b2d\x2db4b5\x2d5e3b1d94460e.mount: Deactivated successfully.
Jan 26 10:17:06 compute-1 nova_compute[226322]: 2026-01-26 10:17:06.393 226326 INFO nova.virt.libvirt.driver [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Deleting instance files /var/lib/nova/instances/b040151a-46d9-4685-84c4-316c2d7feedb_del
Jan 26 10:17:06 compute-1 nova_compute[226322]: 2026-01-26 10:17:06.394 226326 INFO nova.virt.libvirt.driver [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Deletion of /var/lib/nova/instances/b040151a-46d9-4685-84c4-316c2d7feedb_del complete
Jan 26 10:17:06 compute-1 nova_compute[226322]: 2026-01-26 10:17:06.447 226326 INFO nova.compute.manager [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Took 1.64 seconds to destroy the instance on the hypervisor.
Jan 26 10:17:06 compute-1 nova_compute[226322]: 2026-01-26 10:17:06.448 226326 DEBUG oslo.service.loopingcall [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 10:17:06 compute-1 nova_compute[226322]: 2026-01-26 10:17:06.448 226326 DEBUG nova.compute.manager [-] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 10:17:06 compute-1 nova_compute[226322]: 2026-01-26 10:17:06.448 226326 DEBUG nova.network.neutron [-] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 10:17:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:17:07 compute-1 nova_compute[226322]: 2026-01-26 10:17:07.564 226326 DEBUG nova.network.neutron [-] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:17:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:17:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:07.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:17:07 compute-1 nova_compute[226322]: 2026-01-26 10:17:07.846 226326 INFO nova.compute.manager [-] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Took 1.40 seconds to deallocate network for instance.
Jan 26 10:17:07 compute-1 nova_compute[226322]: 2026-01-26 10:17:07.857 226326 DEBUG nova.network.neutron [req-9332e68d-7895-479b-937d-a13200fb5d82 req-723294d8-0094-4732-9a05-ba6b5572064c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updated VIF entry in instance network info cache for port e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 10:17:07 compute-1 nova_compute[226322]: 2026-01-26 10:17:07.858 226326 DEBUG nova.network.neutron [req-9332e68d-7895-479b-937d-a13200fb5d82 req-723294d8-0094-4732-9a05-ba6b5572064c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updating instance_info_cache with network_info: [{"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:17:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:17:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:07.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:17:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:17:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:17:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:17:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:17:08 compute-1 nova_compute[226322]: 2026-01-26 10:17:08.052 226326 DEBUG nova.compute.manager [req-876d7fa6-f212-49a1-b153-9f872622ea70 req-59e3e6aa-9bd3-4a89-a4c0-e9b85c1645b3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received event network-vif-plugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:17:08 compute-1 nova_compute[226322]: 2026-01-26 10:17:08.053 226326 DEBUG oslo_concurrency.lockutils [req-876d7fa6-f212-49a1-b153-9f872622ea70 req-59e3e6aa-9bd3-4a89-a4c0-e9b85c1645b3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:17:08 compute-1 nova_compute[226322]: 2026-01-26 10:17:08.053 226326 DEBUG oslo_concurrency.lockutils [req-876d7fa6-f212-49a1-b153-9f872622ea70 req-59e3e6aa-9bd3-4a89-a4c0-e9b85c1645b3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:17:08 compute-1 nova_compute[226322]: 2026-01-26 10:17:08.054 226326 DEBUG oslo_concurrency.lockutils [req-876d7fa6-f212-49a1-b153-9f872622ea70 req-59e3e6aa-9bd3-4a89-a4c0-e9b85c1645b3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:17:08 compute-1 nova_compute[226322]: 2026-01-26 10:17:08.054 226326 DEBUG nova.compute.manager [req-876d7fa6-f212-49a1-b153-9f872622ea70 req-59e3e6aa-9bd3-4a89-a4c0-e9b85c1645b3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] No waiting events found dispatching network-vif-plugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 10:17:08 compute-1 nova_compute[226322]: 2026-01-26 10:17:08.055 226326 WARNING nova.compute.manager [req-876d7fa6-f212-49a1-b153-9f872622ea70 req-59e3e6aa-9bd3-4a89-a4c0-e9b85c1645b3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received unexpected event network-vif-plugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae for instance with vm_state active and task_state deleting.
Jan 26 10:17:08 compute-1 nova_compute[226322]: 2026-01-26 10:17:08.092 226326 DEBUG oslo_concurrency.lockutils [req-9332e68d-7895-479b-937d-a13200fb5d82 req-723294d8-0094-4732-9a05-ba6b5572064c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 10:17:08 compute-1 nova_compute[226322]: 2026-01-26 10:17:08.101 226326 DEBUG oslo_concurrency.lockutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:17:08 compute-1 nova_compute[226322]: 2026-01-26 10:17:08.102 226326 DEBUG oslo_concurrency.lockutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:17:08 compute-1 nova_compute[226322]: 2026-01-26 10:17:08.176 226326 DEBUG oslo_concurrency.processutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:17:08 compute-1 nova_compute[226322]: 2026-01-26 10:17:08.271 226326 DEBUG nova.compute.manager [req-d886c676-4eac-449e-b225-63b6578784ef req-065daaa1-5dab-43eb-9f7c-fe1991596bc7 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received event network-vif-deleted-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 10:17:08 compute-1 nova_compute[226322]: 2026-01-26 10:17:08.272 226326 INFO nova.compute.manager [req-d886c676-4eac-449e-b225-63b6578784ef req-065daaa1-5dab-43eb-9f7c-fe1991596bc7 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Neutron deleted interface e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae; detaching it from the instance and deleting it from the info cache
Jan 26 10:17:08 compute-1 nova_compute[226322]: 2026-01-26 10:17:08.272 226326 DEBUG nova.network.neutron [req-d886c676-4eac-449e-b225-63b6578784ef req-065daaa1-5dab-43eb-9f7c-fe1991596bc7 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 10:17:08 compute-1 nova_compute[226322]: 2026-01-26 10:17:08.293 226326 DEBUG nova.compute.manager [req-d886c676-4eac-449e-b225-63b6578784ef req-065daaa1-5dab-43eb-9f7c-fe1991596bc7 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Detach interface failed, port_id=e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae, reason: Instance b040151a-46d9-4685-84c4-316c2d7feedb could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 26 10:17:08 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:17:08 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1907053133' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:17:08 compute-1 nova_compute[226322]: 2026-01-26 10:17:08.613 226326 DEBUG oslo_concurrency.processutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:17:08 compute-1 nova_compute[226322]: 2026-01-26 10:17:08.618 226326 DEBUG nova.compute.provider_tree [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:17:08 compute-1 nova_compute[226322]: 2026-01-26 10:17:08.822 226326 DEBUG nova.scheduler.client.report [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:17:08 compute-1 nova_compute[226322]: 2026-01-26 10:17:08.861 226326 DEBUG oslo_concurrency.lockutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.760s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:17:08 compute-1 nova_compute[226322]: 2026-01-26 10:17:08.896 226326 INFO nova.scheduler.client.report [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Deleted allocations for instance b040151a-46d9-4685-84c4-316c2d7feedb
Jan 26 10:17:09 compute-1 ceph-mon[80107]: pgmap v1048: 353 pgs: 353 active+clean; 200 MiB data, 365 MiB used, 60 GiB / 60 GiB avail; 7.7 KiB/s rd, 6.7 KiB/s wr, 2 op/s
Jan 26 10:17:09 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1907053133' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:17:09 compute-1 nova_compute[226322]: 2026-01-26 10:17:09.208 226326 DEBUG oslo_concurrency.lockutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.408s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:17:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:17:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:09.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:17:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:17:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:09.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:17:10 compute-1 nova_compute[226322]: 2026-01-26 10:17:10.070 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:10 compute-1 nova_compute[226322]: 2026-01-26 10:17:10.311 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:10 compute-1 ceph-mon[80107]: pgmap v1049: 353 pgs: 353 active+clean; 121 MiB data, 321 MiB used, 60 GiB / 60 GiB avail; 27 KiB/s rd, 8.5 KiB/s wr, 30 op/s
Jan 26 10:17:10 compute-1 sshd-session[238594]: Invalid user admin from 152.42.136.110 port 49984
Jan 26 10:17:10 compute-1 sshd-session[238594]: Connection closed by invalid user admin 152.42.136.110 port 49984 [preauth]
Jan 26 10:17:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:17:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:11.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:17:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:17:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:11.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:17:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:17:12 compute-1 ceph-mon[80107]: pgmap v1050: 353 pgs: 353 active+clean; 121 MiB data, 321 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 8.5 KiB/s wr, 29 op/s
Jan 26 10:17:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:17:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:17:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:17:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:17:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:13.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:17:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:13.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:17:14 compute-1 podman[238598]: 2026-01-26 10:17:14.297879622 +0000 UTC m=+0.079418122 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 26 10:17:14 compute-1 ceph-mon[80107]: pgmap v1051: 353 pgs: 353 active+clean; 121 MiB data, 321 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 8.5 KiB/s wr, 29 op/s
Jan 26 10:17:15 compute-1 nova_compute[226322]: 2026-01-26 10:17:15.075 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:15 compute-1 nova_compute[226322]: 2026-01-26 10:17:15.312 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:15.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:17:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:15.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:17:16 compute-1 sshd-session[238625]: Invalid user admin from 178.62.249.31 port 35108
Jan 26 10:17:16 compute-1 sshd-session[238625]: Connection closed by invalid user admin 178.62.249.31 port 35108 [preauth]
Jan 26 10:17:16 compute-1 ceph-mon[80107]: pgmap v1052: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 32 KiB/s rd, 9.5 KiB/s wr, 48 op/s
Jan 26 10:17:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:17:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:17.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:17.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:17:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:17:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:17:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:17:18 compute-1 ceph-mon[80107]: pgmap v1053: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 32 KiB/s rd, 2.8 KiB/s wr, 46 op/s
Jan 26 10:17:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:17:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:19.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:19.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:20 compute-1 nova_compute[226322]: 2026-01-26 10:17:20.038 226326 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769422625.0362768, b040151a-46d9-4685-84c4-316c2d7feedb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 10:17:20 compute-1 nova_compute[226322]: 2026-01-26 10:17:20.038 226326 INFO nova.compute.manager [-] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] VM Stopped (Lifecycle Event)
Jan 26 10:17:20 compute-1 nova_compute[226322]: 2026-01-26 10:17:20.061 226326 DEBUG nova.compute.manager [None req-93449595-6e98-4855-a091-dc9e11be79d5 - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 10:17:20 compute-1 nova_compute[226322]: 2026-01-26 10:17:20.076 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:20 compute-1 nova_compute[226322]: 2026-01-26 10:17:20.314 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:20 compute-1 ceph-mon[80107]: pgmap v1054: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 39 KiB/s rd, 3.4 KiB/s wr, 57 op/s
Jan 26 10:17:20 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3993876434' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:17:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:17:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:21.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:17:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:21.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:17:22 compute-1 ceph-mon[80107]: pgmap v1055: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.6 KiB/s wr, 28 op/s
Jan 26 10:17:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:17:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:17:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:17:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:17:23 compute-1 nova_compute[226322]: 2026-01-26 10:17:23.737 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:23 compute-1 nova_compute[226322]: 2026-01-26 10:17:23.848 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:17:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:23.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:17:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:23.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:25 compute-1 nova_compute[226322]: 2026-01-26 10:17:25.103 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:25 compute-1 nova_compute[226322]: 2026-01-26 10:17:25.315 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:25 compute-1 ceph-mon[80107]: pgmap v1056: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.6 KiB/s wr, 28 op/s
Jan 26 10:17:25 compute-1 sudo[238634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:17:25 compute-1 sudo[238634]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:17:25 compute-1 sudo[238634]: pam_unix(sudo:session): session closed for user root
Jan 26 10:17:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:17:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:25.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:17:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:25.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:26 compute-1 ceph-mon[80107]: pgmap v1057: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.6 KiB/s wr, 29 op/s
Jan 26 10:17:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:17:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:27.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:27.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:17:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:17:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:17:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:17:29 compute-1 ceph-mon[80107]: pgmap v1058: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 8.0 KiB/s rd, 597 B/s wr, 11 op/s
Jan 26 10:17:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:29.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:29.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:30 compute-1 nova_compute[226322]: 2026-01-26 10:17:30.105 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:30 compute-1 ceph-mon[80107]: pgmap v1059: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 8.5 KiB/s rd, 597 B/s wr, 11 op/s
Jan 26 10:17:30 compute-1 podman[238662]: 2026-01-26 10:17:30.274938641 +0000 UTC m=+0.049825171 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 26 10:17:30 compute-1 nova_compute[226322]: 2026-01-26 10:17:30.316 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:31.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:17:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:31.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:17:32 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:17:32.054 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:17:32 compute-1 nova_compute[226322]: 2026-01-26 10:17:32.054 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:32 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:17:32.055 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 10:17:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:17:32 compute-1 ceph-mon[80107]: pgmap v1060: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:17:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:17:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:17:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:17:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:17:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:33.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:17:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:33.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:17:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:17:35 compute-1 ceph-mon[80107]: pgmap v1061: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:17:35 compute-1 nova_compute[226322]: 2026-01-26 10:17:35.106 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:35 compute-1 nova_compute[226322]: 2026-01-26 10:17:35.318 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:35.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:35.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:36 compute-1 nova_compute[226322]: 2026-01-26 10:17:36.256 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:17:36 compute-1 nova_compute[226322]: 2026-01-26 10:17:36.257 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:17:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:17:37 compute-1 ceph-mon[80107]: pgmap v1062: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:17:37 compute-1 nova_compute[226322]: 2026-01-26 10:17:37.682 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:17:37 compute-1 nova_compute[226322]: 2026-01-26 10:17:37.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:17:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:37.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:37.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:17:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:17:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:17:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:17:38 compute-1 ceph-mon[80107]: pgmap v1063: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:17:38 compute-1 nova_compute[226322]: 2026-01-26 10:17:38.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:17:38 compute-1 nova_compute[226322]: 2026-01-26 10:17:38.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:17:38 compute-1 nova_compute[226322]: 2026-01-26 10:17:38.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:17:38 compute-1 nova_compute[226322]: 2026-01-26 10:17:38.701 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:17:38 compute-1 nova_compute[226322]: 2026-01-26 10:17:38.701 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:17:38 compute-1 nova_compute[226322]: 2026-01-26 10:17:38.701 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:17:38 compute-1 nova_compute[226322]: 2026-01-26 10:17:38.702 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:17:39 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:17:39.058 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:17:39 compute-1 nova_compute[226322]: 2026-01-26 10:17:39.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:17:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/534014619' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:17:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/249493324' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:17:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2622083868' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:17:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:39.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:39.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:40 compute-1 nova_compute[226322]: 2026-01-26 10:17:40.108 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:40 compute-1 nova_compute[226322]: 2026-01-26 10:17:40.320 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:40 compute-1 nova_compute[226322]: 2026-01-26 10:17:40.682 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:17:41 compute-1 ceph-mon[80107]: pgmap v1064: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:17:41 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3657476507' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:17:41 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3422539289' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:17:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:41.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:41.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:17:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:17:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:17:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:17:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:17:43 compute-1 nova_compute[226322]: 2026-01-26 10:17:43.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:17:43 compute-1 ceph-mon[80107]: pgmap v1065: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:17:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:43.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:43.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:44 compute-1 nova_compute[226322]: 2026-01-26 10:17:44.079 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:17:44 compute-1 nova_compute[226322]: 2026-01-26 10:17:44.080 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:17:44 compute-1 nova_compute[226322]: 2026-01-26 10:17:44.080 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:17:44 compute-1 nova_compute[226322]: 2026-01-26 10:17:44.080 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:17:44 compute-1 nova_compute[226322]: 2026-01-26 10:17:44.081 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:17:44 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:17:44 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1005788338' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:17:44 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Jan 26 10:17:44 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:44.920546) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 10:17:44 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Jan 26 10:17:44 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422664920916, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2389, "num_deletes": 251, "total_data_size": 6512990, "memory_usage": 6591552, "flush_reason": "Manual Compaction"}
Jan 26 10:17:44 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Jan 26 10:17:44 compute-1 ceph-mon[80107]: pgmap v1066: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:17:44 compute-1 nova_compute[226322]: 2026-01-26 10:17:44.928 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.847s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:17:44 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422664952430, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 4191624, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31238, "largest_seqno": 33621, "table_properties": {"data_size": 4181901, "index_size": 6153, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20235, "raw_average_key_size": 20, "raw_value_size": 4162471, "raw_average_value_size": 4221, "num_data_blocks": 264, "num_entries": 986, "num_filter_entries": 986, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769422455, "oldest_key_time": 1769422455, "file_creation_time": 1769422664, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:17:44 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 31766 microseconds, and 13285 cpu microseconds.
Jan 26 10:17:44 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:17:44 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:44.952609) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 4191624 bytes OK
Jan 26 10:17:44 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:44.952676) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Jan 26 10:17:44 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:44.954659) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Jan 26 10:17:44 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:44.954674) EVENT_LOG_v1 {"time_micros": 1769422664954669, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 10:17:44 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:44.954725) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 10:17:44 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 6502424, prev total WAL file size 6502424, number of live WAL files 2.
Jan 26 10:17:44 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:17:44 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:44.956686) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Jan 26 10:17:44 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 10:17:44 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(4093KB)], [60(11MB)]
Jan 26 10:17:44 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422664956769, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 16529384, "oldest_snapshot_seqno": -1}
Jan 26 10:17:45 compute-1 podman[238713]: 2026-01-26 10:17:45.055439048 +0000 UTC m=+0.085536202 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 10:17:45 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 6227 keys, 14392038 bytes, temperature: kUnknown
Jan 26 10:17:45 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422665071226, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 14392038, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14350719, "index_size": 24633, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15621, "raw_key_size": 159448, "raw_average_key_size": 25, "raw_value_size": 14238855, "raw_average_value_size": 2286, "num_data_blocks": 990, "num_entries": 6227, "num_filter_entries": 6227, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769422664, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:17:45 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:17:45 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:45.071521) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 14392038 bytes
Jan 26 10:17:45 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:45.073413) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 144.3 rd, 125.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 11.8 +0.0 blob) out(13.7 +0.0 blob), read-write-amplify(7.4) write-amplify(3.4) OK, records in: 6748, records dropped: 521 output_compression: NoCompression
Jan 26 10:17:45 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:45.073434) EVENT_LOG_v1 {"time_micros": 1769422665073424, "job": 36, "event": "compaction_finished", "compaction_time_micros": 114548, "compaction_time_cpu_micros": 36189, "output_level": 6, "num_output_files": 1, "total_output_size": 14392038, "num_input_records": 6748, "num_output_records": 6227, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 10:17:45 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:17:45 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422665074534, "job": 36, "event": "table_file_deletion", "file_number": 62}
Jan 26 10:17:45 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:17:45 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422665077584, "job": 36, "event": "table_file_deletion", "file_number": 60}
Jan 26 10:17:45 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:44.956622) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:17:45 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:45.077653) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:17:45 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:45.077657) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:17:45 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:45.077659) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:17:45 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:45.077661) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:17:45 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:45.077663) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:17:45 compute-1 nova_compute[226322]: 2026-01-26 10:17:45.100 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:17:45 compute-1 nova_compute[226322]: 2026-01-26 10:17:45.102 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4847MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:17:45 compute-1 nova_compute[226322]: 2026-01-26 10:17:45.102 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:17:45 compute-1 nova_compute[226322]: 2026-01-26 10:17:45.102 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:17:45 compute-1 nova_compute[226322]: 2026-01-26 10:17:45.111 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:45 compute-1 nova_compute[226322]: 2026-01-26 10:17:45.165 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:17:45 compute-1 nova_compute[226322]: 2026-01-26 10:17:45.165 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:17:45 compute-1 nova_compute[226322]: 2026-01-26 10:17:45.182 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing inventories for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 26 10:17:45 compute-1 nova_compute[226322]: 2026-01-26 10:17:45.243 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating ProviderTree inventory for provider d06842a0-5d13-4573-bb78-d433bbb380e4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 26 10:17:45 compute-1 nova_compute[226322]: 2026-01-26 10:17:45.243 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating inventory in ProviderTree for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 10:17:45 compute-1 nova_compute[226322]: 2026-01-26 10:17:45.258 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing aggregate associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 26 10:17:45 compute-1 nova_compute[226322]: 2026-01-26 10:17:45.283 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing trait associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, traits: HW_CPU_X86_CLMUL,HW_CPU_X86_SSE,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_FMA3,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 26 10:17:45 compute-1 nova_compute[226322]: 2026-01-26 10:17:45.313 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:17:45 compute-1 nova_compute[226322]: 2026-01-26 10:17:45.328 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:45 compute-1 sudo[238759]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:17:45 compute-1 sudo[238759]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:17:45 compute-1 sudo[238759]: pam_unix(sudo:session): session closed for user root
Jan 26 10:17:45 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:17:45 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/338352986' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:17:45 compute-1 nova_compute[226322]: 2026-01-26 10:17:45.754 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:17:45 compute-1 nova_compute[226322]: 2026-01-26 10:17:45.759 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:17:45 compute-1 nova_compute[226322]: 2026-01-26 10:17:45.772 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:17:45 compute-1 nova_compute[226322]: 2026-01-26 10:17:45.790 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:17:45 compute-1 nova_compute[226322]: 2026-01-26 10:17:45.791 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:17:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:45.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1005788338' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:17:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/338352986' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:17:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:17:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:45.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:17:46 compute-1 ceph-mon[80107]: pgmap v1067: 353 pgs: 353 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Jan 26 10:17:46 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3031763687' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:17:46 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3541673696' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 10:17:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:17:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:17:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:47.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:17:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:47.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:17:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:17:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:17:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:17:49 compute-1 ceph-mon[80107]: pgmap v1068: 353 pgs: 353 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Jan 26 10:17:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:17:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:17:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:49.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:17:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:49.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:50 compute-1 nova_compute[226322]: 2026-01-26 10:17:50.113 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:50 compute-1 nova_compute[226322]: 2026-01-26 10:17:50.330 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:50 compute-1 ceph-mon[80107]: pgmap v1069: 353 pgs: 353 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 1.5 MiB/s rd, 1.8 MiB/s wr, 87 op/s
Jan 26 10:17:50 compute-1 sshd-session[238788]: Invalid user admin from 152.42.136.110 port 39482
Jan 26 10:17:50 compute-1 sshd-session[238788]: Connection closed by invalid user admin 152.42.136.110 port 39482 [preauth]
Jan 26 10:17:50 compute-1 sudo[238791]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:17:50 compute-1 sudo[238791]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:17:50 compute-1 sudo[238791]: pam_unix(sudo:session): session closed for user root
Jan 26 10:17:51 compute-1 sudo[238816]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:17:51 compute-1 sudo[238816]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:17:51 compute-1 sudo[238816]: pam_unix(sudo:session): session closed for user root
Jan 26 10:17:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:51.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:51.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:17:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:17:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:17:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:17:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:17:53 compute-1 ceph-mon[80107]: pgmap v1070: 353 pgs: 353 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 1.5 MiB/s rd, 1.8 MiB/s wr, 87 op/s
Jan 26 10:17:53 compute-1 sshd-session[238873]: Invalid user centos from 104.248.198.71 port 41358
Jan 26 10:17:53 compute-1 sshd-session[238873]: Connection closed by invalid user centos 104.248.198.71 port 41358 [preauth]
Jan 26 10:17:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:17:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:53.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:17:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:17:53.941 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:17:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:17:53.942 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:17:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:17:53.942 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:17:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:54.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:54 compute-1 ceph-mon[80107]: pgmap v1071: 353 pgs: 353 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 1.5 MiB/s rd, 1.8 MiB/s wr, 87 op/s
Jan 26 10:17:55 compute-1 nova_compute[226322]: 2026-01-26 10:17:55.116 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:55 compute-1 nova_compute[226322]: 2026-01-26 10:17:55.332 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:17:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:55.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:56.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:56 compute-1 ceph-mon[80107]: pgmap v1072: 353 pgs: 353 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 102 op/s
Jan 26 10:17:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:17:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:17:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:57.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:17:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:17:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:17:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:17:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:17:58 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:17:58 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:17:58 compute-1 ceph-mon[80107]: pgmap v1073: 353 pgs: 353 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 26 10:17:58 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:17:58 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:17:58 compute-1 ceph-mon[80107]: pgmap v1074: 353 pgs: 353 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 15 KiB/s wr, 88 op/s
Jan 26 10:17:58 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:17:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/3940202117' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:17:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/3940202117' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:17:58 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:17:58 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:17:58 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:17:58 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:17:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:17:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:58.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:17:58 compute-1 sshd-session[238877]: Connection closed by 87.236.176.202 port 57973
Jan 26 10:17:58 compute-1 sshd-session[238879]: Connection closed by 87.236.176.202 port 53949 [preauth]
Jan 26 10:17:58 compute-1 ovn_controller[133670]: 2026-01-26T10:17:58Z|00075|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Jan 26 10:17:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:17:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:17:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:59.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:18:00 compute-1 nova_compute[226322]: 2026-01-26 10:18:00.118 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:00 compute-1 ceph-mon[80107]: pgmap v1075: 353 pgs: 353 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 572 KiB/s rd, 22 op/s
Jan 26 10:18:00 compute-1 nova_compute[226322]: 2026-01-26 10:18:00.333 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:00.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:00 compute-1 sshd-session[238881]: Invalid user admin from 178.62.249.31 port 43392
Jan 26 10:18:00 compute-1 podman[238884]: 2026-01-26 10:18:00.812620916 +0000 UTC m=+0.057975967 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 26 10:18:01 compute-1 sshd-session[238881]: Connection closed by invalid user admin 178.62.249.31 port 43392 [preauth]
Jan 26 10:18:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:01.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:18:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:02.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:18:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:18:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:18:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:18:03 compute-1 ceph-mon[80107]: pgmap v1076: 353 pgs: 353 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 572 KiB/s rd, 22 op/s
Jan 26 10:18:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:18:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:03.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:18:03 compute-1 sudo[238906]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:18:03 compute-1 sudo[238906]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:18:03 compute-1 sudo[238906]: pam_unix(sudo:session): session closed for user root
Jan 26 10:18:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:18:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:04.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:18:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:18:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:18:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:18:04 compute-1 ceph-mon[80107]: pgmap v1077: 353 pgs: 353 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 572 KiB/s rd, 22 op/s
Jan 26 10:18:05 compute-1 nova_compute[226322]: 2026-01-26 10:18:05.119 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:05 compute-1 nova_compute[226322]: 2026-01-26 10:18:05.335 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:05 compute-1 sudo[238931]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:18:05 compute-1 sudo[238931]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:18:05 compute-1 sudo[238931]: pam_unix(sudo:session): session closed for user root
Jan 26 10:18:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:18:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:05.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:18:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:06.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:07 compute-1 ceph-mon[80107]: pgmap v1078: 353 pgs: 353 active+clean; 121 MiB data, 318 MiB used, 60 GiB / 60 GiB avail; 390 KiB/s rd, 2.5 MiB/s wr, 77 op/s
Jan 26 10:18:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:18:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:07.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:18:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:18:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:18:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:18:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:18:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:08.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:18:08 compute-1 ceph-mon[80107]: pgmap v1079: 353 pgs: 353 active+clean; 121 MiB data, 318 MiB used, 60 GiB / 60 GiB avail; 390 KiB/s rd, 2.5 MiB/s wr, 77 op/s
Jan 26 10:18:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:18:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:09.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:18:10 compute-1 nova_compute[226322]: 2026-01-26 10:18:10.122 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:10 compute-1 ceph-mon[80107]: pgmap v1080: 353 pgs: 353 active+clean; 121 MiB data, 318 MiB used, 60 GiB / 60 GiB avail; 329 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 26 10:18:10 compute-1 nova_compute[226322]: 2026-01-26 10:18:10.337 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:10.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:18:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:11.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:18:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:12.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:12 compute-1 ceph-mon[80107]: pgmap v1081: 353 pgs: 353 active+clean; 121 MiB data, 318 MiB used, 60 GiB / 60 GiB avail; 311 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Jan 26 10:18:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:18:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:18:12.552 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:18:12 compute-1 nova_compute[226322]: 2026-01-26 10:18:12.554 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:12 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:18:12.554 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 10:18:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:18:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:18:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:18:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:18:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:13.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:14 compute-1 ceph-mon[80107]: pgmap v1082: 353 pgs: 353 active+clean; 121 MiB data, 318 MiB used, 60 GiB / 60 GiB avail; 311 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Jan 26 10:18:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:14.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:15 compute-1 nova_compute[226322]: 2026-01-26 10:18:15.125 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:15 compute-1 podman[238961]: 2026-01-26 10:18:15.321732972 +0000 UTC m=+0.105719991 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 10:18:15 compute-1 nova_compute[226322]: 2026-01-26 10:18:15.339 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:15 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:18:15.556 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:18:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:15.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:16 compute-1 ceph-mon[80107]: pgmap v1083: 353 pgs: 353 active+clean; 121 MiB data, 318 MiB used, 60 GiB / 60 GiB avail; 315 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 26 10:18:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:16.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:18:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:17.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:18:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:18:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:18:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:18:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:18.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:19 compute-1 ceph-mon[80107]: pgmap v1084: 353 pgs: 353 active+clean; 121 MiB data, 318 MiB used, 60 GiB / 60 GiB avail; 4.4 KiB/s rd, 12 KiB/s wr, 4 op/s
Jan 26 10:18:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:18:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:18:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:19.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:18:20 compute-1 nova_compute[226322]: 2026-01-26 10:18:20.127 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:20 compute-1 ceph-mon[80107]: pgmap v1085: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 13 KiB/s wr, 29 op/s
Jan 26 10:18:20 compute-1 nova_compute[226322]: 2026-01-26 10:18:20.341 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:18:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:20.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:18:21 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1394289376' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:18:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:18:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:21.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:18:22 compute-1 ceph-mon[80107]: pgmap v1086: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Jan 26 10:18:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:22.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:18:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:18:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:18:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:18:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:18:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:18:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:23.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:18:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:18:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:24.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:18:25 compute-1 ceph-mon[80107]: pgmap v1087: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Jan 26 10:18:25 compute-1 nova_compute[226322]: 2026-01-26 10:18:25.128 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:25 compute-1 nova_compute[226322]: 2026-01-26 10:18:25.342 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:25 compute-1 sudo[238995]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:18:25 compute-1 sudo[238995]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:18:25 compute-1 sudo[238995]: pam_unix(sudo:session): session closed for user root
Jan 26 10:18:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:25.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:26 compute-1 ceph-mon[80107]: pgmap v1088: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 29 op/s
Jan 26 10:18:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:26.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:18:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:18:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:27.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:18:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:18:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:18:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:18:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:18:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:18:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:28.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:18:29 compute-1 ceph-mon[80107]: pgmap v1089: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.2 KiB/s wr, 25 op/s
Jan 26 10:18:29 compute-1 sshd-session[239021]: Invalid user admin from 152.42.136.110 port 33994
Jan 26 10:18:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:29.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:30 compute-1 nova_compute[226322]: 2026-01-26 10:18:30.131 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:30 compute-1 sshd-session[239021]: Connection closed by invalid user admin 152.42.136.110 port 33994 [preauth]
Jan 26 10:18:30 compute-1 nova_compute[226322]: 2026-01-26 10:18:30.344 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:30.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:31 compute-1 podman[239024]: 2026-01-26 10:18:31.292806041 +0000 UTC m=+0.075999268 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 26 10:18:31 compute-1 ceph-mon[80107]: pgmap v1090: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.2 KiB/s wr, 25 op/s
Jan 26 10:18:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:18:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:31.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:18:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:32.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:18:32 compute-1 ceph-mon[80107]: pgmap v1091: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:18:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:18:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:18:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:18:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:18:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:18:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:33.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:34.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:34 compute-1 ceph-mon[80107]: pgmap v1092: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:18:34 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 26 10:18:35 compute-1 nova_compute[226322]: 2026-01-26 10:18:35.133 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:35 compute-1 nova_compute[226322]: 2026-01-26 10:18:35.346 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:35.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:36.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:36 compute-1 nova_compute[226322]: 2026-01-26 10:18:36.792 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:18:36 compute-1 nova_compute[226322]: 2026-01-26 10:18:36.792 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:18:37 compute-1 ceph-mon[80107]: pgmap v1093: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:18:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:18:37 compute-1 nova_compute[226322]: 2026-01-26 10:18:37.682 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:18:37 compute-1 nova_compute[226322]: 2026-01-26 10:18:37.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:18:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:37.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:18:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:18:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:18:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:18:38 compute-1 ceph-mon[80107]: pgmap v1094: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:18:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:38.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:38 compute-1 sshd-session[239048]: Invalid user centos from 104.248.198.71 port 54056
Jan 26 10:18:38 compute-1 sshd-session[239048]: Connection closed by invalid user centos 104.248.198.71 port 54056 [preauth]
Jan 26 10:18:38 compute-1 nova_compute[226322]: 2026-01-26 10:18:38.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:18:38 compute-1 nova_compute[226322]: 2026-01-26 10:18:38.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:18:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2256897382' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:18:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/793596853' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:18:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:18:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:39.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:18:40 compute-1 nova_compute[226322]: 2026-01-26 10:18:40.135 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:40 compute-1 ceph-mon[80107]: pgmap v1095: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:18:40 compute-1 nova_compute[226322]: 2026-01-26 10:18:40.348 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:18:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:40.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:18:40 compute-1 nova_compute[226322]: 2026-01-26 10:18:40.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:18:40 compute-1 nova_compute[226322]: 2026-01-26 10:18:40.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:18:40 compute-1 nova_compute[226322]: 2026-01-26 10:18:40.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:18:40 compute-1 nova_compute[226322]: 2026-01-26 10:18:40.714 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:18:40 compute-1 nova_compute[226322]: 2026-01-26 10:18:40.715 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:18:41 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2704312734' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:18:41 compute-1 nova_compute[226322]: 2026-01-26 10:18:41.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:18:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:41.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:42 compute-1 ceph-mon[80107]: pgmap v1096: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:18:42 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2708101103' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:18:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:18:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:42.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:18:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:18:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:18:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:18:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:18:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:18:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:18:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:43.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:18:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:18:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:44.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:18:45 compute-1 ceph-mon[80107]: pgmap v1097: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:18:45 compute-1 nova_compute[226322]: 2026-01-26 10:18:45.139 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:45 compute-1 nova_compute[226322]: 2026-01-26 10:18:45.350 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:45 compute-1 sshd-session[239053]: Invalid user admin from 178.62.249.31 port 50866
Jan 26 10:18:45 compute-1 podman[239055]: 2026-01-26 10:18:45.488910271 +0000 UTC m=+0.116889941 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 10:18:45 compute-1 nova_compute[226322]: 2026-01-26 10:18:45.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:18:45 compute-1 nova_compute[226322]: 2026-01-26 10:18:45.715 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:18:45 compute-1 nova_compute[226322]: 2026-01-26 10:18:45.716 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:18:45 compute-1 nova_compute[226322]: 2026-01-26 10:18:45.716 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:18:45 compute-1 nova_compute[226322]: 2026-01-26 10:18:45.717 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:18:45 compute-1 nova_compute[226322]: 2026-01-26 10:18:45.717 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:18:45 compute-1 sshd-session[239053]: Connection closed by invalid user admin 178.62.249.31 port 50866 [preauth]
Jan 26 10:18:45 compute-1 sudo[239083]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:18:45 compute-1 sudo[239083]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:18:45 compute-1 sudo[239083]: pam_unix(sudo:session): session closed for user root
Jan 26 10:18:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:18:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:45.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:18:46 compute-1 ceph-mon[80107]: pgmap v1098: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:18:46 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:18:46 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3089679829' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:18:46 compute-1 nova_compute[226322]: 2026-01-26 10:18:46.186 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:18:46 compute-1 nova_compute[226322]: 2026-01-26 10:18:46.343 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:18:46 compute-1 nova_compute[226322]: 2026-01-26 10:18:46.344 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4870MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:18:46 compute-1 nova_compute[226322]: 2026-01-26 10:18:46.344 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:18:46 compute-1 nova_compute[226322]: 2026-01-26 10:18:46.345 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:18:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:46.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:46 compute-1 nova_compute[226322]: 2026-01-26 10:18:46.513 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:18:46 compute-1 nova_compute[226322]: 2026-01-26 10:18:46.514 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:18:46 compute-1 nova_compute[226322]: 2026-01-26 10:18:46.532 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:18:46 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:18:46 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3843976593' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:18:46 compute-1 nova_compute[226322]: 2026-01-26 10:18:46.981 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:18:46 compute-1 nova_compute[226322]: 2026-01-26 10:18:46.986 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:18:47 compute-1 nova_compute[226322]: 2026-01-26 10:18:47.010 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:18:47 compute-1 nova_compute[226322]: 2026-01-26 10:18:47.012 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:18:47 compute-1 nova_compute[226322]: 2026-01-26 10:18:47.012 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:18:47 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3089679829' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:18:47 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3843976593' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:18:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:18:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:47.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:18:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:18:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:18:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:18:48 compute-1 ceph-mon[80107]: pgmap v1099: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:18:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:18:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:48.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:18:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:18:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:49.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:50 compute-1 nova_compute[226322]: 2026-01-26 10:18:50.139 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:50 compute-1 ceph-mon[80107]: pgmap v1100: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:18:50 compute-1 nova_compute[226322]: 2026-01-26 10:18:50.352 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:18:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:50.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:18:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:52.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:52 compute-1 ceph-mon[80107]: pgmap v1101: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:18:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:52.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:18:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:18:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:18:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:18:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:18:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:18:53.942 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:18:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:18:53.943 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:18:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:18:53.943 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:18:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:54.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:54.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:55 compute-1 ceph-mon[80107]: pgmap v1102: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:18:55 compute-1 nova_compute[226322]: 2026-01-26 10:18:55.146 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:55 compute-1 nova_compute[226322]: 2026-01-26 10:18:55.412 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:18:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:56.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:56 compute-1 ceph-mon[80107]: pgmap v1103: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:18:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:56.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:18:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:18:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:18:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:18:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:18:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:58.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 10:18:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1448684425' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:18:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 10:18:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1448684425' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:18:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:18:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:18:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:58.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:18:59 compute-1 ceph-mon[80107]: pgmap v1104: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:18:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/1448684425' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:18:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/1448684425' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:19:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:19:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:00.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:19:00 compute-1 ceph-mon[80107]: pgmap v1105: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:19:00 compute-1 nova_compute[226322]: 2026-01-26 10:19:00.149 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:00 compute-1 nova_compute[226322]: 2026-01-26 10:19:00.414 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:19:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:00.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:19:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:02.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:02 compute-1 podman[239159]: 2026-01-26 10:19:02.275109933 +0000 UTC m=+0.049689178 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 26 10:19:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:02.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:19:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:19:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:19:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:19:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:19:03 compute-1 ceph-mon[80107]: pgmap v1106: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:19:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:04.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:19:04 compute-1 ceph-mon[80107]: pgmap v1107: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:19:04 compute-1 sudo[239179]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:19:04 compute-1 sudo[239179]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:19:04 compute-1 sudo[239179]: pam_unix(sudo:session): session closed for user root
Jan 26 10:19:04 compute-1 sudo[239204]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:19:04 compute-1 sudo[239204]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:19:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:04.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:04 compute-1 sudo[239204]: pam_unix(sudo:session): session closed for user root
Jan 26 10:19:05 compute-1 nova_compute[226322]: 2026-01-26 10:19:05.175 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:05 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:19:05 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:19:05 compute-1 ceph-mon[80107]: pgmap v1108: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 1 op/s
Jan 26 10:19:05 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:19:05 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:19:05 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:19:05 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:19:05 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:19:05 compute-1 nova_compute[226322]: 2026-01-26 10:19:05.415 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:05 compute-1 sudo[239261]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:19:05 compute-1 sudo[239261]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:19:05 compute-1 sudo[239261]: pam_unix(sudo:session): session closed for user root
Jan 26 10:19:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:06.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:19:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:06.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:19:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:19:07 compute-1 ceph-mon[80107]: pgmap v1109: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Jan 26 10:19:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:19:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:19:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:19:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:19:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:08.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:08.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:09 compute-1 sshd-session[239287]: Invalid user admin from 152.42.136.110 port 36454
Jan 26 10:19:09 compute-1 sshd-session[239287]: Connection closed by invalid user admin 152.42.136.110 port 36454 [preauth]
Jan 26 10:19:09 compute-1 sudo[239290]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:19:09 compute-1 sudo[239290]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:19:09 compute-1 sudo[239290]: pam_unix(sudo:session): session closed for user root
Jan 26 10:19:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:19:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:10.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:19:10 compute-1 ceph-mon[80107]: pgmap v1110: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Jan 26 10:19:10 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:19:10 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:19:10 compute-1 nova_compute[226322]: 2026-01-26 10:19:10.176 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:10 compute-1 nova_compute[226322]: 2026-01-26 10:19:10.417 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:10.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:12.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:12 compute-1 ceph-mon[80107]: pgmap v1111: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Jan 26 10:19:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:19:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:12.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:19:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:19:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:19:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:19:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:19:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:19:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:19:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:14.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:19:14 compute-1 ceph-mon[80107]: pgmap v1112: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Jan 26 10:19:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:19:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:14.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.548622) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422754548673, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 1162, "num_deletes": 250, "total_data_size": 2809440, "memory_usage": 2841704, "flush_reason": "Manual Compaction"}
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422754565155, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 1805113, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33626, "largest_seqno": 34783, "table_properties": {"data_size": 1799984, "index_size": 2589, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 10304, "raw_average_key_size": 18, "raw_value_size": 1789742, "raw_average_value_size": 3156, "num_data_blocks": 114, "num_entries": 567, "num_filter_entries": 567, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769422665, "oldest_key_time": 1769422665, "file_creation_time": 1769422754, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 16591 microseconds, and 7245 cpu microseconds.
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.565213) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 1805113 bytes OK
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.565243) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.567775) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.567797) EVENT_LOG_v1 {"time_micros": 1769422754567790, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.567837) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 2803804, prev total WAL file size 2803804, number of live WAL files 2.
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.568961) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323532' seq:72057594037927935, type:22 .. '6B7600353033' seq:0, type:0; will stop at (end)
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(1762KB)], [63(13MB)]
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422754569024, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 16197151, "oldest_snapshot_seqno": -1}
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6280 keys, 14950493 bytes, temperature: kUnknown
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422754703944, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 14950493, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14908118, "index_size": 25561, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15749, "raw_key_size": 162279, "raw_average_key_size": 25, "raw_value_size": 14794509, "raw_average_value_size": 2355, "num_data_blocks": 1015, "num_entries": 6280, "num_filter_entries": 6280, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769422754, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.704272) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 14950493 bytes
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.722273) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 120.0 rd, 110.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 13.7 +0.0 blob) out(14.3 +0.0 blob), read-write-amplify(17.3) write-amplify(8.3) OK, records in: 6794, records dropped: 514 output_compression: NoCompression
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.722322) EVENT_LOG_v1 {"time_micros": 1769422754722303, "job": 38, "event": "compaction_finished", "compaction_time_micros": 135026, "compaction_time_cpu_micros": 41453, "output_level": 6, "num_output_files": 1, "total_output_size": 14950493, "num_input_records": 6794, "num_output_records": 6280, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422754723177, "job": 38, "event": "table_file_deletion", "file_number": 65}
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422754727852, "job": 38, "event": "table_file_deletion", "file_number": 63}
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.568869) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.727900) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.727905) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.727908) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.727911) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:19:14 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.727913) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:19:15 compute-1 nova_compute[226322]: 2026-01-26 10:19:15.264 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:15 compute-1 nova_compute[226322]: 2026-01-26 10:19:15.418 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:15 compute-1 ceph-mon[80107]: pgmap v1113: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 1 op/s
Jan 26 10:19:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:19:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:16.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:19:16 compute-1 podman[239318]: 2026-01-26 10:19:16.316496945 +0000 UTC m=+0.098936904 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 26 10:19:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:16.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:19:17 compute-1 ceph-mon[80107]: pgmap v1114: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:19:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:19:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:19:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:19:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:19:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:18.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:18.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:19:19 compute-1 ceph-mon[80107]: pgmap v1115: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:19:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:20.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:20 compute-1 nova_compute[226322]: 2026-01-26 10:19:20.265 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:20 compute-1 nova_compute[226322]: 2026-01-26 10:19:20.420 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:20.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:22 compute-1 ceph-mon[80107]: pgmap v1116: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:19:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:22.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:22.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:19:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:19:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:19:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:19:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:19:23 compute-1 sshd-session[239347]: Accepted publickey for zuul from 192.168.122.10 port 60902 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 10:19:23 compute-1 systemd-logind[783]: New session 55 of user zuul.
Jan 26 10:19:23 compute-1 systemd[1]: Started Session 55 of User zuul.
Jan 26 10:19:23 compute-1 sshd-session[239347]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 10:19:23 compute-1 sudo[239351]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 26 10:19:23 compute-1 sudo[239351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:19:24 compute-1 ceph-mon[80107]: pgmap v1117: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:19:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:19:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:24.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:19:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:24.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:25 compute-1 nova_compute[226322]: 2026-01-26 10:19:25.313 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:25 compute-1 nova_compute[226322]: 2026-01-26 10:19:25.422 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:26 compute-1 sudo[239532]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:19:26 compute-1 sudo[239532]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:19:26 compute-1 sudo[239532]: pam_unix(sudo:session): session closed for user root
Jan 26 10:19:26 compute-1 ceph-mon[80107]: pgmap v1118: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:19:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:19:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:26.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:19:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:19:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:26.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:19:26 compute-1 sshd-session[239584]: Invalid user centos from 104.248.198.71 port 33246
Jan 26 10:19:26 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 26 10:19:26 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4108331622' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 26 10:19:26 compute-1 sshd-session[239584]: Connection closed by invalid user centos 104.248.198.71 port 33246 [preauth]
Jan 26 10:19:27 compute-1 ceph-mon[80107]: from='client.26147 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:27 compute-1 ceph-mon[80107]: from='client.16542 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:27 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/4108331622' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 26 10:19:27 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2457933205' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 26 10:19:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:19:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:19:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:19:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:19:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:19:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:28.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:28 compute-1 ceph-mon[80107]: from='client.25795 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:28 compute-1 ceph-mon[80107]: from='client.26162 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:28 compute-1 ceph-mon[80107]: from='client.16554 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:28 compute-1 ceph-mon[80107]: pgmap v1119: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:19:28 compute-1 ceph-mon[80107]: from='client.25810 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:28 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2766876846' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 26 10:19:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:19:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:28.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:19:28 compute-1 sshd-session[239679]: Invalid user admin from 178.62.249.31 port 46528
Jan 26 10:19:29 compute-1 sshd-session[239679]: Connection closed by invalid user admin 178.62.249.31 port 46528 [preauth]
Jan 26 10:19:29 compute-1 ceph-mon[80107]: pgmap v1120: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:19:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:19:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:30.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:19:30 compute-1 nova_compute[226322]: 2026-01-26 10:19:30.315 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:30 compute-1 nova_compute[226322]: 2026-01-26 10:19:30.423 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:30.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:31 compute-1 ovs-vsctl[239748]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 26 10:19:31 compute-1 ceph-mon[80107]: pgmap v1121: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:19:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:19:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:32.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:19:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:19:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:32.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:19:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:19:32 compute-1 virtqemud[225791]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 26 10:19:32 compute-1 virtqemud[225791]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 26 10:19:32 compute-1 virtqemud[225791]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 26 10:19:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:19:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:19:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:19:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:19:33 compute-1 podman[239967]: 2026-01-26 10:19:33.28656198 +0000 UTC m=+0.067347138 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:19:33 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: cache status {prefix=cache status} (starting...)
Jan 26 10:19:33 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:19:33 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: client ls {prefix=client ls} (starting...)
Jan 26 10:19:33 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:19:33 compute-1 lvm[240120]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 10:19:33 compute-1 lvm[240120]: VG ceph_vg0 finished
Jan 26 10:19:34 compute-1 ceph-mon[80107]: pgmap v1122: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:19:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:19:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:19:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:34.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:19:34 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: damage ls {prefix=damage ls} (starting...)
Jan 26 10:19:34 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:19:34 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump loads {prefix=dump loads} (starting...)
Jan 26 10:19:34 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:19:34 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Jan 26 10:19:34 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1630855317' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 26 10:19:34 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 26 10:19:34 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:19:34 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 26 10:19:34 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:19:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:19:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:34.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:19:34 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 26 10:19:34 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:19:34 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Jan 26 10:19:34 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1100227087' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:19:34 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 26 10:19:34 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:19:35 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 26 10:19:35 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:19:35 compute-1 ceph-mon[80107]: from='client.26186 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:35 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1630855317' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 26 10:19:35 compute-1 ceph-mon[80107]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 26 10:19:35 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3883529240' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 26 10:19:35 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1100227087' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:19:35 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1218341045' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:19:35 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Jan 26 10:19:35 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2569982764' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 26 10:19:35 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 26 10:19:35 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:19:35 compute-1 nova_compute[226322]: 2026-01-26 10:19:35.318 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:35 compute-1 nova_compute[226322]: 2026-01-26 10:19:35.426 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:35 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: ops {prefix=ops} (starting...)
Jan 26 10:19:35 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:19:35 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Jan 26 10:19:35 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2035422281' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 26 10:19:35 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Jan 26 10:19:35 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/109052100' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 26 10:19:35 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 26 10:19:35 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2878709684' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 10:19:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:36.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:36 compute-1 ceph-mon[80107]: from='client.16578 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:36 compute-1 ceph-mon[80107]: from='client.26198 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:36 compute-1 ceph-mon[80107]: from='client.16593 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:36 compute-1 ceph-mon[80107]: from='client.26204 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:36 compute-1 ceph-mon[80107]: pgmap v1123: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:19:36 compute-1 ceph-mon[80107]: from='client.16605 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:36 compute-1 ceph-mon[80107]: from='client.26219 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:36 compute-1 ceph-mon[80107]: from='client.25840 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:36 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2569982764' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 26 10:19:36 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2585676288' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 26 10:19:36 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1939179752' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 26 10:19:36 compute-1 ceph-mon[80107]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 26 10:19:36 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2035422281' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 26 10:19:36 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/109052100' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 26 10:19:36 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1817716702' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 26 10:19:36 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3839696754' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:19:36 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2767220968' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 26 10:19:36 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2878709684' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 10:19:36 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: session ls {prefix=session ls} (starting...)
Jan 26 10:19:36 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:19:36 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: status {prefix=status} (starting...)
Jan 26 10:19:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 26 10:19:36 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3164859405' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 10:19:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:36.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 26 10:19:36 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1762731804' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 26 10:19:36 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Jan 26 10:19:36 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1146455129' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 26 10:19:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 26 10:19:37 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3257461002' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 10:19:37 compute-1 ceph-mon[80107]: from='client.16620 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:37 compute-1 ceph-mon[80107]: from='client.25855 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:37 compute-1 ceph-mon[80107]: from='client.25867 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:37 compute-1 ceph-mon[80107]: from='client.26258 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:37 compute-1 ceph-mon[80107]: from='client.16638 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:37 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/355643123' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 26 10:19:37 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/4039591914' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 10:19:37 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3164859405' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 10:19:37 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2820007' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 26 10:19:37 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3683707489' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 26 10:19:37 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2871500887' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 10:19:37 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1762731804' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 26 10:19:37 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1146455129' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 26 10:19:37 compute-1 ceph-mon[80107]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 26 10:19:37 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1174436668' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 26 10:19:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 26 10:19:37 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/130987033' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 10:19:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Jan 26 10:19:37 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/655899828' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 26 10:19:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:19:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Jan 26 10:19:37 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3972199707' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 26 10:19:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:19:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:19:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:19:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:19:38 compute-1 nova_compute[226322]: 2026-01-26 10:19:38.012 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:19:38 compute-1 nova_compute[226322]: 2026-01-26 10:19:38.013 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:19:38 compute-1 nova_compute[226322]: 2026-01-26 10:19:38.013 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:19:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:38.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:38 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 26 10:19:38 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4231684812' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 10:19:38 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Jan 26 10:19:38 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2147534676' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 26 10:19:38 compute-1 ceph-mon[80107]: from='client.25876 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:38 compute-1 ceph-mon[80107]: from='client.26276 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:38 compute-1 ceph-mon[80107]: from='client.16653 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:38 compute-1 ceph-mon[80107]: pgmap v1124: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:19:38 compute-1 ceph-mon[80107]: from='client.25921 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:38 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3257461002' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 10:19:38 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1508688286' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 26 10:19:38 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/130987033' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 10:19:38 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/655899828' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 26 10:19:38 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/637312991' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 26 10:19:38 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1056870647' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 10:19:38 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3972199707' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 26 10:19:38 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2156067595' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 10:19:38 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3912734262' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 26 10:19:38 compute-1 ceph-mon[80107]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 26 10:19:38 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/4235847919' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 26 10:19:38 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Jan 26 10:19:38 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/325511667' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 26 10:19:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:19:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:38.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:19:38 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Jan 26 10:19:38 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3812039934' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 26 10:19:38 compute-1 nova_compute[226322]: 2026-01-26 10:19:38.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:19:38 compute-1 nova_compute[226322]: 2026-01-26 10:19:38.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:19:38 compute-1 nova_compute[226322]: 2026-01-26 10:19:38.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:19:39 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 26 10:19:39 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4000464440' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: from='client.25936 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: from='client.26330 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: from='client.16698 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/4231684812' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2147534676' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/4176106981' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/325511667' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1990822071' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2903306104' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: from='client.26366 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/779557225' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3812039934' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1117934623' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: from='client.25990 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/4165636823' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: pgmap v1125: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:19:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3940556563' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: from='client.26390 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: from='client.16755 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/4000464440' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1421307840' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 26 10:19:39 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1848180643' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 10:19:39 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Jan 26 10:19:39 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/136388260' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 737280 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:46:57.040016+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73596928 unmapped: 729088 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:46:58.040390+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927696 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73596928 unmapped: 729088 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:46:59.040560+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73605120 unmapped: 720896 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:00.040789+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d172495400 session 0x55d16feee000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 712704 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:01.041237+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 712704 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:02.041401+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 704512 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:03.042749+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927696 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 704512 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:04.043924+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 704512 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:05.044092+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73629696 unmapped: 696320 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.554183006s of 12.558165550s, submitted: 1
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172162400
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:06.044320+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73629696 unmapped: 696320 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:07.044565+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73629696 unmapped: 696320 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:08.045555+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927105 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73637888 unmapped: 688128 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:09.045950+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73637888 unmapped: 688128 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:10.046097+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 679936 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:11.046533+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 679936 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:12.046692+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 671744 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:13.047241+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927105 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73662464 unmapped: 663552 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:14.047638+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73662464 unmapped: 663552 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:15.048094+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 655360 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:16.048347+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172163800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.911679268s of 10.922836304s, submitted: 3
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 655360 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:17.048528+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 655360 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:18.048784+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930129 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 647168 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:19.048992+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 647168 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:20.049130+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73687040 unmapped: 638976 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:21.049263+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73687040 unmapped: 638976 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:22.049432+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73687040 unmapped: 638976 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:23.049588+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930129 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 630784 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:24.049817+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 622592 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:25.050000+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 622592 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:26.050205+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73711616 unmapped: 614400 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:27.050400+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 606208 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:28.050547+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930129 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 598016 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d172162400 session 0x55d172658b40
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:29.050746+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 598016 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:30.050957+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 598016 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:31.051105+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 589824 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:32.051229+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 589824 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:33.051389+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930129 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 589824 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:34.051601+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73744384 unmapped: 581632 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:35.051831+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73744384 unmapped: 581632 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:36.052052+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 573440 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:37.052191+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 573440 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:38.052417+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930129 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 573440 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:39.052604+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 565248 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:40.052761+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 565248 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:41.052877+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73768960 unmapped: 557056 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:42.053028+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73768960 unmapped: 557056 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:43.053213+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930129 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73768960 unmapped: 557056 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:44.053407+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73785344 unmapped: 540672 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:45.053637+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73785344 unmapped: 540672 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:46.053875+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73793536 unmapped: 532480 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:47.053994+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73793536 unmapped: 532480 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:48.054159+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930129 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 32.089416504s of 32.094387054s, submitted: 1
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73793536 unmapped: 532480 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:49.054295+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73801728 unmapped: 524288 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:50.054487+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73801728 unmapped: 524288 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:51.054626+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73809920 unmapped: 516096 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:52.054822+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73809920 unmapped: 516096 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:53.054999+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928947 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73809920 unmapped: 516096 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:54.055167+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73818112 unmapped: 507904 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:55.055331+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73818112 unmapped: 507904 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:56.055495+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 499712 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:57.055665+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 499712 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:58.055943+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d172163800 session 0x55d172e80780
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928947 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 499712 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:47:59.056115+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73834496 unmapped: 491520 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:00.056252+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73834496 unmapped: 491520 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:01.056394+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:02.056541+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 483328 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:03.056694+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 483328 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928947 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:04.056901+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 483328 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:05.057099+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73850880 unmapped: 475136 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:06.057316+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73850880 unmapped: 475136 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:07.057589+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73850880 unmapped: 475136 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:08.057758+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 466944 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928947 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:09.057927+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 466944 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:10.058075+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 450560 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:11.058239+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 450560 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:12.058380+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 450560 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 450560 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:13.269432+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928947 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73883648 unmapped: 442368 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:14.269562+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73883648 unmapped: 442368 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:15.269689+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 434176 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:16.269934+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 434176 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:17.270148+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 434176 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:18.270383+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928947 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73900032 unmapped: 425984 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:19.270586+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73900032 unmapped: 425984 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:20.270772+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 417792 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:21.271023+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 417792 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:22.271195+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 417792 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:23.271431+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928947 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73916416 unmapped: 409600 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:24.271609+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73916416 unmapped: 409600 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:25.271830+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73916416 unmapped: 409600 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:26.272017+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 401408 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:27.272167+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73932800 unmapped: 393216 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:28.272355+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928947 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 385024 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:29.272583+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 376832 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:30.272773+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 41.662242889s of 41.667964935s, submitted: 2
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 376832 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:31.272944+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73957376 unmapped: 368640 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:32.273079+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73957376 unmapped: 368640 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:33.273227+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 360448 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:34.273463+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 360448 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:35.273603+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 360448 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:36.273768+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 352256 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:37.273975+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 352256 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:38.274118+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 352256 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:39.274247+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73981952 unmapped: 344064 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:40.274372+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73981952 unmapped: 344064 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:41.274542+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 335872 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:42.274745+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 335872 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:43.274870+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 327680 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:44.274953+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 327680 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:45.275120+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 327680 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:46.275299+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 319488 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:47.275419+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 319488 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:48.275581+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 319488 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:49.275786+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74022912 unmapped: 303104 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:50.276089+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74022912 unmapped: 303104 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:51.276242+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74031104 unmapped: 294912 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:52.276392+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74031104 unmapped: 294912 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:53.276537+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74031104 unmapped: 294912 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:54.276774+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74047488 unmapped: 278528 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:55.276889+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74047488 unmapped: 278528 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:56.277026+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74055680 unmapped: 270336 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:57.277143+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74055680 unmapped: 270336 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:58.277285+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74055680 unmapped: 270336 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:48:59.277440+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74063872 unmapped: 262144 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:00.277598+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74063872 unmapped: 262144 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:01.277805+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74072064 unmapped: 253952 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:02.277935+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74072064 unmapped: 253952 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:03.278088+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74072064 unmapped: 253952 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:04.278233+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74080256 unmapped: 245760 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:05.278400+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:06.278665+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74080256 unmapped: 245760 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:07.278870+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74080256 unmapped: 245760 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17042a800 session 0x55d172efef00
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:08.279050+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74088448 unmapped: 237568 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:09.279340+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74088448 unmapped: 237568 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:10.279556+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74096640 unmapped: 229376 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:11.279759+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74096640 unmapped: 229376 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:12.280100+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74104832 unmapped: 221184 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:13.280262+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74113024 unmapped: 212992 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:14.280419+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74121216 unmapped: 204800 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:15.280577+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74129408 unmapped: 196608 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:16.280799+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74129408 unmapped: 196608 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:17.280964+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74129408 unmapped: 196608 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:18.281151+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74137600 unmapped: 188416 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:19.281329+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74137600 unmapped: 188416 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:20.281485+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:21.281648+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:22.281824+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:23.281987+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74153984 unmapped: 172032 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:24.282200+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172162400
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74153984 unmapped: 172032 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:25.282410+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74162176 unmapped: 163840 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:26.282607+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74162176 unmapped: 163840 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:27.282752+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74162176 unmapped: 163840 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:28.282909+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:29.283108+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:30.283252+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 60.023979187s of 60.028354645s, submitted: 1
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74178560 unmapped: 147456 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:31.283431+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74178560 unmapped: 147456 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:32.283636+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74178560 unmapped: 147456 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:33.283864+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:34.284030+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:35.284300+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:36.284538+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:37.284799+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:38.284973+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74203136 unmapped: 122880 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:39.285161+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74211328 unmapped: 114688 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:40.285464+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:41.285829+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:42.287878+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:43.288163+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74227712 unmapped: 98304 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:44.288771+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74227712 unmapped: 98304 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:45.290201+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74227712 unmapped: 98304 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:46.291519+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:47.292665+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:48.293675+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74244096 unmapped: 81920 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:49.294123+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74244096 unmapped: 81920 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:50.295012+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74252288 unmapped: 73728 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:51.295171+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 6316 writes, 26K keys, 6316 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 6316 writes, 1069 syncs, 5.91 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 6316 writes, 26K keys, 6316 commit groups, 1.0 writes per commit group, ingest: 19.65 MB, 0.03 MB/s
                                           Interval WAL: 6316 writes, 1069 syncs, 5.91 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:52.295350+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:53.295589+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 8192 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:54.295766+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 8192 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:55.295941+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 0 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:56.296807+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 0 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:57.297431+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 0 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:58.297664+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 0 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:49:59.297936+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:00.298260+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:01.298433+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:02.298589+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:03.298751+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:04.298910+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:05.299077+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:06.299291+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74350592 unmapped: 1024000 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:07.299480+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74350592 unmapped: 1024000 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:08.299755+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 1015808 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:09.299978+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 1015808 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:10.300207+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 1015808 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:11.300372+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:12.300648+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:13.300902+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:14.301091+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:15.301284+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:16.301544+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74383360 unmapped: 991232 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:17.301738+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74383360 unmapped: 991232 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:18.301904+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74383360 unmapped: 991232 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:19.302132+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74391552 unmapped: 983040 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:20.302297+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74391552 unmapped: 983040 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:21.302410+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:22.302542+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:23.302761+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:24.302911+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74407936 unmapped: 966656 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:25.303057+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74407936 unmapped: 966656 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:26.303237+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74407936 unmapped: 966656 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:27.303425+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74416128 unmapped: 958464 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:28.303595+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74416128 unmapped: 958464 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:29.303753+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:30.303896+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:31.304028+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:32.304251+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74432512 unmapped: 942080 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:33.304511+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74432512 unmapped: 942080 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:34.304724+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74432512 unmapped: 942080 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:35.304912+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 933888 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:36.305139+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 933888 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:37.305390+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 933888 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:38.305564+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74448896 unmapped: 925696 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:39.305772+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74448896 unmapped: 925696 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:40.305953+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:41.306101+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:42.306296+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:43.306523+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74465280 unmapped: 909312 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:44.306761+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74465280 unmapped: 909312 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:45.307041+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 901120 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:46.307288+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 901120 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:47.307467+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74481664 unmapped: 892928 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:48.307614+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74481664 unmapped: 892928 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:49.307763+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:50.307929+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74481664 unmapped: 892928 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:51.308094+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:52.308275+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:53.308473+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:54.308646+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74498048 unmapped: 876544 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:55.308800+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74498048 unmapped: 876544 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:56.309982+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74498048 unmapped: 876544 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:57.311899+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74506240 unmapped: 868352 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:58.314106+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74498048 unmapped: 876544 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:50:59.315216+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74506240 unmapped: 868352 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:00.315473+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74506240 unmapped: 868352 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:01.315665+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74506240 unmapped: 868352 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d172162400 session 0x55d17293ba40
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:02.315890+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74514432 unmapped: 860160 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:03.316075+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74514432 unmapped: 860160 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:04.316298+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:05.316495+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:06.316678+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:07.316850+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74530816 unmapped: 843776 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:08.317051+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74530816 unmapped: 843776 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:09.317230+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:10.317418+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74547200 unmapped: 827392 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:11.317598+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74547200 unmapped: 827392 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:12.317833+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74547200 unmapped: 827392 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:13.318053+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 819200 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:14.318221+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74563584 unmapped: 811008 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:15.318508+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74563584 unmapped: 811008 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:16.318785+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74563584 unmapped: 811008 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:17.319020+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 794624 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:18.319197+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 794624 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:19.320118+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 786432 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:20.320287+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 786432 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:21.321402+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 786432 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:22.321582+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74596352 unmapped: 778240 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:23.321831+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74596352 unmapped: 778240 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:24.322019+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74596352 unmapped: 778240 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 114.382240295s of 114.386558533s, submitted: 1
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:25.322306+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:26.322505+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:27.322744+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:28.322999+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:29.323231+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:30.323418+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 737280 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:31.323844+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 737280 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:32.324045+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 720896 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:33.324224+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 720896 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:34.324387+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 720896 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:35.324577+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74670080 unmapped: 704512 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:36.324835+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74670080 unmapped: 704512 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:37.325050+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 696320 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:38.325357+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 696320 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:39.325555+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 696320 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:40.325804+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74686464 unmapped: 688128 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:41.326024+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74686464 unmapped: 688128 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:42.326250+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74694656 unmapped: 679936 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:43.326438+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74694656 unmapped: 679936 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:44.326646+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74694656 unmapped: 679936 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:45.326775+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74702848 unmapped: 671744 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:46.326993+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74702848 unmapped: 671744 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:47.327153+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74711040 unmapped: 663552 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:48.327324+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74711040 unmapped: 663552 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 23.977069855s of 23.989372253s, submitted: 2
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:49.327497+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74809344 unmapped: 1613824 heap: 76423168 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:50.327674+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76980224 unmapped: 491520 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:51.327920+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76980224 unmapped: 491520 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:52.328095+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76980224 unmapped: 491520 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:53.328247+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76980224 unmapped: 491520 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:54.328407+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76980224 unmapped: 491520 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:55.328582+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76980224 unmapped: 491520 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:56.328821+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76988416 unmapped: 483328 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:57.328985+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76988416 unmapped: 483328 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:58.329149+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76988416 unmapped: 483328 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:51:59.329351+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76988416 unmapped: 483328 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:00.329523+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76988416 unmapped: 483328 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:01.329676+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76988416 unmapped: 483328 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:02.329857+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76988416 unmapped: 483328 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:03.330034+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76996608 unmapped: 475136 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:04.330208+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76996608 unmapped: 475136 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:05.330389+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d171202800 session 0x55d1725c52c0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76996608 unmapped: 475136 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:06.330572+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76996608 unmapped: 475136 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:07.330751+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76996608 unmapped: 475136 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:08.330864+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76996608 unmapped: 475136 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:09.330981+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:10.331111+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:11.331232+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:12.331340+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:13.331489+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:14.331615+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:15.331786+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:16.332005+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:17.332187+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:18.332365+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:19.332558+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:20.332774+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77012992 unmapped: 458752 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:21.332990+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77012992 unmapped: 458752 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164400
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 32.814674377s of 33.227375031s, submitted: 154
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:22.333174+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77012992 unmapped: 458752 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:23.333412+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77012992 unmapped: 458752 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:24.333658+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77012992 unmapped: 458752 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932892 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:25.333774+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77012992 unmapped: 458752 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:26.333999+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77094912 unmapped: 376832 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:27.334189+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 1327104 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:28.334331+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 1327104 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:29.334532+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77209600 unmapped: 1310720 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:30.334764+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 1318912 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:31.334939+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 1318912 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:32.335103+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 1318912 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:33.335298+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77209600 unmapped: 1310720 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:34.335466+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77209600 unmapped: 1310720 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:35.335655+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 1302528 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:36.335902+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 1302528 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:37.336302+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 1302528 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:38.336431+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 1302528 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:39.336582+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 1302528 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:40.336733+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 1302528 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:41.336972+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 1302528 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:42.337192+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 1294336 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:43.337370+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 1294336 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:44.337536+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 1294336 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:45.337753+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 1294336 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:46.337987+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 1294336 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:47.338145+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 1294336 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:48.338337+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 1294336 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:49.338550+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 1294336 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:50.338792+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:51.338981+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:52.339158+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:53.339341+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:54.339518+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:55.339719+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:56.340795+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:57.340987+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:58.341327+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:52:59.341590+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:00.341844+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:01.342062+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:02.342259+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 1277952 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:03.342546+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 1277952 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:04.342953+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 1277952 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:05.343203+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 1277952 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:06.343507+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 1277952 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:07.343819+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 1277952 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:08.344040+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 1277952 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:09.344230+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 1269760 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:10.344469+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 1269760 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:11.344626+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 1269760 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:12.344872+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 1269760 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:13.345047+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 1269760 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:14.345228+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 1269760 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:15.345396+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 1269760 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:16.345616+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 1269760 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:17.345802+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 1261568 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:18.346000+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 1261568 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:19.346177+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:20.346381+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:21.346554+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:22.347539+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:23.347839+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:24.348025+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:25.348191+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:26.348404+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:27.348572+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:28.348753+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:29.348928+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:30.349103+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:31.349267+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:32.349488+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:33.349763+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:34.349961+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:35.350127+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17042a000 session 0x55d17019eb40
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:36.350424+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:37.350657+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d172164400 session 0x55d172e48d20
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:38.350914+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:39.351065+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:40.351218+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:41.351428+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:42.351771+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:43.352015+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:44.352199+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:45.352386+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:46.352596+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:47.352893+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:48.353137+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:49.353374+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:50.353607+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:51.353837+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:52.354085+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:53.354491+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:54.354798+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 92.695404053s of 93.076416016s, submitted: 88
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:55.355007+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933813 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:56.355354+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:57.355690+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:58.357014+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:53:59.357289+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:00.357606+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933813 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:01.358256+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:02.358891+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:03.359398+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:04.359874+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:05.360264+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:06.360491+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:07.360817+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:08.361033+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:09.361300+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:10.361496+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:11.361666+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:12.361783+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:13.361942+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:14.362099+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:15.362275+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:16.362507+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:17.362655+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:18.362904+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:19.363230+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:20.363432+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:21.363686+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:22.363961+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:23.364195+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:24.364578+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:25.364856+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:26.365207+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:27.365379+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:28.365535+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:29.365686+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:30.365830+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:31.365963+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:32.366099+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:33.366486+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:34.366736+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:35.366943+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:36.367162+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:37.367400+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:38.367667+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:39.367885+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:40.368126+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77340672 unmapped: 1179648 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:41.368911+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77340672 unmapped: 1179648 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:42.369047+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77340672 unmapped: 1179648 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:43.369221+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:44.369413+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:45.369625+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:46.369838+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17042a000 session 0x55d1731454a0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:47.370040+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:48.370261+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:49.370531+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:50.370784+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:51.371050+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:52.371270+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:53.371495+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:54.371747+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:55.372035+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:56.372403+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:57.372696+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:58.372907+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:54:59.373279+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:00.373444+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77373440 unmapped: 1146880 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:01.373617+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77373440 unmapped: 1146880 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:02.373787+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77373440 unmapped: 1146880 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:03.373933+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 69.010238647s of 69.024299622s, submitted: 2
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:04.374080+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:05.374256+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934734 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:06.374426+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:07.374610+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:08.374753+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:09.375353+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:10.375607+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934143 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:11.375784+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:12.375923+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:13.376096+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:14.376230+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:15.376436+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934143 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:16.376604+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:17.376756+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:18.377408+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:19.378607+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:20.378743+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77398016 unmapped: 1122304 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934143 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:21.379438+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77398016 unmapped: 1122304 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:22.379572+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77398016 unmapped: 1122304 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:23.379792+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77398016 unmapped: 1122304 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:24.380134+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77398016 unmapped: 1122304 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:25.380369+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77398016 unmapped: 1122304 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934143 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:26.380692+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77398016 unmapped: 1122304 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:27.380922+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77398016 unmapped: 1122304 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:28.381101+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77398016 unmapped: 1122304 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:29.381244+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77406208 unmapped: 1114112 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:30.381378+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77406208 unmapped: 1114112 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934143 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:31.381547+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77406208 unmapped: 1114112 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:32.381763+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77406208 unmapped: 1114112 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:33.381947+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77406208 unmapped: 1114112 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:34.382132+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77406208 unmapped: 1114112 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:35.382304+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17042a800 session 0x55d1703c43c0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77406208 unmapped: 1114112 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934143 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:36.382512+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77406208 unmapped: 1114112 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:37.382676+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77406208 unmapped: 1114112 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:38.382864+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77406208 unmapped: 1114112 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:39.383049+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77422592 unmapped: 1097728 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:40.383271+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77422592 unmapped: 1097728 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934143 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:41.383416+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77422592 unmapped: 1097728 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:42.383565+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77422592 unmapped: 1097728 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:43.383809+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77422592 unmapped: 1097728 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:44.383950+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77422592 unmapped: 1097728 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:45.384155+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77422592 unmapped: 1097728 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934143 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:46.384388+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77422592 unmapped: 1097728 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:47.384598+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77422592 unmapped: 1097728 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:48.384816+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77422592 unmapped: 1097728 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:49.384978+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 45.224506378s of 45.233642578s, submitted: 2
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 1064960 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:50.385136+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 1064960 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935655 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:51.385307+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 1064960 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:52.385493+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172162400
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 1064960 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:53.385623+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 1064960 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:54.385818+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 1064960 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:55.385963+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 1056768 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937167 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:56.386134+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 1056768 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:57.386299+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 1056768 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:58.386455+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 1056768 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:55:59.386583+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 1040384 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:00.386734+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 1040384 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936576 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:01.386895+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d171202800 session 0x55d17311cd20
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 1040384 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:02.387044+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 1040384 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:03.387247+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 1040384 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:04.387428+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:05.387616+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936576 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:06.387845+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:07.388039+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:08.388331+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:09.388640+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:10.388863+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936576 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:11.389028+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:12.389158+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:13.389429+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:14.389629+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:15.389889+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936576 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:16.390205+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:17.390351+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:18.390533+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172163800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 1015808 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:19.390669+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 991232 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:20.390785+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 991232 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936576 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:21.390931+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 991232 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 32.827796936s of 32.840641022s, submitted: 3
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:22.391062+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:23.391188+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:24.391364+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:25.391875+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:26.392076+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:27.392254+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:28.392440+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:29.392628+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:30.392780+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:31.392940+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:32.393071+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:33.393255+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:34.393449+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:35.393657+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:36.393902+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:37.394088+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d172162400 session 0x55d170046780
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:38.394330+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:39.394543+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77561856 unmapped: 958464 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:40.394778+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77561856 unmapped: 958464 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:41.394954+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:42.395112+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:43.395244+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:44.395382+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:45.395534+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:46.395736+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:47.395909+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:48.396033+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:49.396220+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:50.396382+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:51.396538+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:52.396685+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:53.396864+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:54.396997+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:55.397114+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:56.397331+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77578240 unmapped: 942080 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:57.397477+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77578240 unmapped: 942080 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:58.397631+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77586432 unmapped: 933888 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:56:59.397760+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:00.397910+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:01.398109+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:02.398302+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:03.398474+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:04.398603+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:05.398849+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:06.399090+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:07.399273+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:08.399508+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:09.399741+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:10.399957+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:11.400149+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:12.400326+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:13.400545+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:14.400746+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:15.400921+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:16.401089+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:17.401279+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:18.401482+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:19.401656+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:20.401832+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:21.401996+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:22.402186+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:23.402365+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:24.402543+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:25.402763+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:26.402969+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d170b6b400 session 0x55d1721d3a40
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172162400
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:27.403125+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:28.403405+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:29.403584+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:30.403771+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:31.403930+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:32.404077+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:33.404199+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:34.404351+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77627392 unmapped: 892928 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:35.404498+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77627392 unmapped: 892928 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:36.404773+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 74.881858826s of 74.889221191s, submitted: 2
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77643776 unmapped: 876544 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:37.404938+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77651968 unmapped: 868352 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:38.405062+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77651968 unmapped: 868352 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:39.405214+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17042a000 session 0x55d1720950e0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:40.405906+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936906 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:41.406076+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:42.406268+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:43.406447+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:44.406599+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:45.406775+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936906 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:46.406955+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:47.407100+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:48.407257+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:49.407407+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:50.407589+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936906 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:51.407749+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:52.407879+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:53.408047+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:54.408167+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.996082306s of 18.000623703s, submitted: 1
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:55.408267+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938418 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:56.408435+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:57.408563+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:58.408741+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:59.408908+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:00.409049+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938418 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:01.409163+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:02.409298+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:03.409475+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:04.409669+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:05.409814+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938418 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:06.410089+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:07.410245+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:08.410430+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:09.410612+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.481588364s of 14.488451958s, submitted: 1
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77701120 unmapped: 819200 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:10.411328+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77709312 unmapped: 811008 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939339 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:11.411519+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77709312 unmapped: 811008 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:12.411746+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77709312 unmapped: 811008 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:13.411916+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77709312 unmapped: 811008 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:14.412301+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:15.412416+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77709312 unmapped: 811008 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:16.412596+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 794624 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:17.412802+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 794624 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:18.412920+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 794624 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:19.413075+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 794624 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:20.413231+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:21.413378+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:22.413559+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:23.413717+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:24.413869+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:25.414014+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:26.414236+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:27.414428+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:28.414595+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:29.414771+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:30.414914+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:31.415099+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:32.415256+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:33.415412+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:34.415546+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:35.415770+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:36.415924+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:37.416087+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:38.416229+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:39.416405+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:40.416620+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:41.416877+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:42.417047+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:43.417402+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:44.417536+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:45.417679+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:46.417884+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:47.418058+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:48.418887+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:49.419050+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17042a800 session 0x55d172095c20
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:50.419287+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17035d000 session 0x55d17260d0e0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164400
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:51.419520+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:52.419815+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:53.420029+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:54.420194+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:55.420340+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:56.421003+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:57.421136+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:58.421321+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 729088 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:59.421563+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 729088 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:00.421725+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:01.421886+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:02.422063+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:03.422273+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:04.422467+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:05.422651+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:06.422946+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17035d000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 56.857223511s of 56.950168610s, submitted: 4
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:07.423142+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:08.423273+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:09.423401+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:10.423540+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:11.423825+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:12.424019+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:13.424333+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:14.424776+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:15.425546+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:16.427011+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:17.427191+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:18.427399+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:19.427519+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 696320 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:20.427767+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77840384 unmapped: 679936 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:21.428854+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77840384 unmapped: 679936 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:22.429004+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77840384 unmapped: 679936 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:23.429191+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:24.429318+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:25.429459+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:26.429657+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:27.429810+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:28.429966+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:29.430089+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:30.430231+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:31.430401+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:32.430610+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:33.430786+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:34.430929+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:35.431097+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:36.431280+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:37.431426+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:38.431637+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:39.431764+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:40.431909+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:41.432046+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:42.432180+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:43.432423+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:44.432586+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:45.432723+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:46.432867+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:47.433007+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:48.433128+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17035d000 session 0x55d172e81860
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:49.433272+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:50.433463+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:51.433592+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 6838 writes, 27K keys, 6838 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 6838 writes, 1330 syncs, 5.14 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 522 writes, 785 keys, 522 commit groups, 1.0 writes per commit group, ingest: 0.25 MB, 0.00 MB/s
                                           Interval WAL: 522 writes, 261 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:52.433772+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:53.433978+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:54.434341+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:55.434533+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:56.434847+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:57.434979+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:58.435336+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:59.435477+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:00.435868+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:01.436000+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:02.436137+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:03.436267+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:04.436390+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77889536 unmapped: 630784 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 58.666145325s of 58.674335480s, submitted: 2
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:05.436531+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77889536 unmapped: 630784 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:06.436766+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941181 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77889536 unmapped: 630784 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:07.436881+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77889536 unmapped: 630784 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:08.436991+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77889536 unmapped: 630784 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread fragmentation_score=0.000026 took=0.000118s
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:09.437161+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:10.437320+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:11.437455+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:12.437741+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:13.437938+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:14.438133+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:15.438387+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:16.438602+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:17.438785+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:18.438917+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:19.439082+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:20.439207+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:21.439327+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:22.439550+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:23.439721+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:24.439888+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:25.440022+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:26.440137+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:27.440291+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:28.440669+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:29.440800+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:30.440944+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:31.441068+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:32.441257+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:33.441383+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:34.441514+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:35.441657+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:36.441881+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:37.442038+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:38.442157+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:39.442286+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:40.442420+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:41.442537+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d171202800 session 0x55d17260c1e0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:42.442671+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:43.442776+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:44.442909+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:45.443060+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:46.443271+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:47.443433+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:48.443591+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:49.443757+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:50.443901+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:51.444077+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:52.444261+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:53.444396+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:54.444528+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:55.444778+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:56.444993+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:57.445158+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d172163800 session 0x55d16feeed20
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:58.445342+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:59.445572+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:00.445772+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:01.445927+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 56.543655396s of 56.563098907s, submitted: 2
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939999 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:02.446072+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:03.446231+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:04.446410+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:05.446545+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:06.446765+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939999 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:07.446944+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:08.447108+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:09.447257+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:10.447489+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:11.447617+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939999 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.324966431s of 10.332220078s, submitted: 1
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:12.447762+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:13.447885+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:14.448020+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:15.448211+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:16.448417+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:17.448570+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:18.448763+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:19.448888+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77979648 unmapped: 540672 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:20.449055+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77979648 unmapped: 540672 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:21.449222+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:22.449378+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:23.449548+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:24.449755+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:25.449918+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:26.450179+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:27.450335+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:28.450496+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:29.450680+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:30.450852+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:31.450994+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:32.451143+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:33.451290+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:34.451483+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:35.451593+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:36.451685+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77996032 unmapped: 524288 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:37.451812+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77996032 unmapped: 524288 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:38.451955+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77996032 unmapped: 524288 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:39.452067+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:40.452189+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:41.452343+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:42.452488+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:43.452609+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:44.452759+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:45.452962+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:46.453211+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:47.453365+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:48.453501+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 36.846134186s of 36.849975586s, submitted: 1
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78102528 unmapped: 417792 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:49.453613+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78290944 unmapped: 229376 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:50.453774+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78323712 unmapped: 196608 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:51.453896+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78323712 unmapped: 196608 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:52.454020+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78323712 unmapped: 196608 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:53.454190+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78323712 unmapped: 196608 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:54.454436+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78323712 unmapped: 196608 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:55.454591+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:56.454796+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:57.454950+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:58.455096+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:59.455282+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:00.455502+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:01.455691+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:02.455883+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:03.456069+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:04.456243+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:05.456384+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:06.456564+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:07.456732+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:08.456893+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:09.457061+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:10.457196+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:11.457364+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:12.457506+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:13.457629+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:14.457812+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:15.457955+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:16.458140+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:17.458291+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:18.458428+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:19.458572+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:20.458827+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:21.459155+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:22.459380+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:23.459576+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:24.459743+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78356480 unmapped: 163840 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:25.459883+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78356480 unmapped: 163840 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:26.460074+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78356480 unmapped: 163840 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:27.460550+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 37.952041626s of 38.464164734s, submitted: 139
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78381056 unmapped: 1187840 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:28.460685+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:29.461524+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:30.461675+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:31.462302+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:32.462511+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:33.462789+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:34.462905+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:35.463055+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:36.463224+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78422016 unmapped: 1146880 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:37.463378+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78422016 unmapped: 1146880 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:38.463536+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:39.463775+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:40.463934+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:41.464161+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:42.464294+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:43.464441+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:44.464591+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:45.464777+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:46.464945+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:47.465065+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:48.465193+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:49.465353+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:50.465479+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:51.465692+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:52.465864+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:53.466000+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:54.466156+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:55.466308+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:56.466458+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:57.466617+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:58.466760+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:59.466890+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:00.467039+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:01.467203+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:02.467349+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:03.467487+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:04.467636+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:05.467818+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:06.468033+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:07.468345+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:08.468593+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:09.468798+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:10.469047+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:11.469363+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:12.469580+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:13.469913+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:14.470096+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:15.470353+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:16.470543+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:17.470719+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:18.470906+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:19.471230+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:20.471407+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 52.869400024s of 53.368377686s, submitted: 86
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:21.471569+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:22.471798+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943023 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:23.472005+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:24.472174+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:25.472388+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:26.472623+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:27.472758+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943023 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:28.472904+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:29.473103+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:30.473275+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:31.473436+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:32.473578+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943023 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.398575783s of 12.402492523s, submitted: 1
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:33.473748+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:34.473919+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:35.474120+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17035d000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:36.474318+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17254c000 session 0x55d17293a5a0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:37.474516+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943944 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:38.474668+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:39.474862+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:40.475044+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:41.475254+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:42.475397+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943944 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:43.475554+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:44.475750+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:45.475868+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:46.476042+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:47.476168+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943944 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:48.476328+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:49.476485+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:50.476645+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:51.476832+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:52.476967+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943944 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:53.477141+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.989221573s of 20.182910919s, submitted: 2
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:54.477280+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:55.477439+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:56.477604+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:57.477751+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 945456 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:58.477902+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:59.478039+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:00.478177+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:01.478306+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:02.478523+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:03.478684+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:04.478867+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:05.479002+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:06.479194+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:07.479367+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:08.479541+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:09.479792+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:10.479944+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:11.480086+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:12.480391+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:13.481667+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:14.481977+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:15.482757+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:16.483065+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:17.483563+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:18.483865+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:19.484041+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:20.484275+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:21.484534+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:22.484748+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:23.484938+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:24.485095+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:25.485242+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:26.485487+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:27.485947+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:28.486179+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:29.486507+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:30.486755+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:31.486991+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:32.487158+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:33.487298+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:34.487480+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:35.487747+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:36.487988+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:37.488262+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:38.488424+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:39.488578+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:40.488736+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:41.488886+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:42.489031+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:43.489175+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:44.489316+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:45.489884+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:46.490246+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:47.490882+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:48.491226+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:49.491418+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:50.491797+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:51.492298+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17035d000 session 0x55d16feee1e0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:52.492546+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:53.492869+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:54.493138+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 1089536 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:55.493300+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 1089536 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:56.493465+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 1089536 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:57.493774+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 1089536 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:58.494008+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 1089536 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:59.494216+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:00.495160+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:01.495408+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:02.495658+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:03.495849+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:04.495998+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:05.496206+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:06.496523+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:07.496820+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:08.497071+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:09.497321+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:10.497603+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:11.497871+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:12.498044+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 handle_osd_map epochs [144,144], i have 143, src has [1,144]
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 79.554496765s of 79.564147949s, submitted: 3
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _renew_subs
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 143 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78536704 unmapped: 1032192 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:13.498180+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78536704 unmapped: 1032192 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:14.498373+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 79724544 unmapped: 16629760 heap: 96354304 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:15.498519+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 146 ms_handle_reset con 0x55d171202800 session 0x55d17263d4a0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042ac00
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80936960 unmapped: 23814144 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:16.498740+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 146 heartbeat osd_stat(store_statfs(0x4fb58a000/0x0/0x4ffc00000, data 0x15d8375/0x1691000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80936960 unmapped: 23814144 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:17.498898+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _renew_subs
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 147 ms_handle_reset con 0x55d17042ac00 session 0x55d172584d20
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1101069 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:18.499051+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:19.499199+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:20.499397+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 147 heartbeat osd_stat(store_statfs(0x4fb585000/0x0/0x4ffc00000, data 0x15da4a0/0x1695000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 147 handle_osd_map epochs [148,148], i have 147, src has [1,148]
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 147 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:21.499536+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:22.499643+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:23.499779+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:24.499930+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:25.500065+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:26.500280+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:27.500436+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:28.500593+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:29.500764+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:30.500901+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:31.501038+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:32.501197+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:33.514913+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:34.515077+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:35.515243+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:36.515435+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:37.515576+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80961536 unmapped: 23789568 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:38.515740+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80961536 unmapped: 23789568 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:39.515910+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:40.516060+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:41.516202+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:42.516317+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:43.516472+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:44.516552+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:45.516684+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:46.516860+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:47.516981+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:48.517117+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:49.517256+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:50.517410+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:51.517553+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:52.517677+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:53.517808+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 40.111835480s of 40.450466156s, submitted: 51
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:54.518038+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:55.518206+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:56.518410+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:57.518563+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1104407 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:58.518783+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:59.518921+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:00.519085+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb584000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:01.519226+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80986112 unmapped: 23764992 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:02.519383+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80986112 unmapped: 23764992 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102976 data_alloc: 218103808 data_used: 176128
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:03.519591+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80986112 unmapped: 23764992 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172151000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 148 ms_handle_reset con 0x55d172151000 session 0x55d172095e00
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17035d000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 148 ms_handle_reset con 0x55d17035d000 session 0x55d172672960
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:04.519761+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 92413952 unmapped: 12337152 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042ac00
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb584000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:05.519919+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 92504064 unmapped: 12247040 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.857666016s of 12.088842392s, submitted: 2
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:06.520212+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 92504064 unmapped: 12247040 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _renew_subs
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 149 handle_osd_map epochs [150,150], i have 149, src has [1,150]
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:07.520413+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 150 heartbeat osd_stat(store_statfs(0x4fb57c000/0x0/0x4ffc00000, data 0x15e06b1/0x169f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d17042ac00 session 0x55d17019b680
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d171202800 session 0x55d172671680
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d17254c000 session 0x55d172671860
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172153400
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1167114 data_alloc: 234881024 data_used: 11649024
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:08.520595+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d172153400 session 0x55d172671a40
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17035d000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d17035d000 session 0x55d172671c20
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:09.520772+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:10.520928+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 150 heartbeat osd_stat(store_statfs(0x4fb234000/0x0/0x4ffc00000, data 0x19286b1/0x19e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:11.521067+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:12.521176+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042ac00
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d17042ac00 session 0x55d1726714a0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 150 handle_osd_map epochs [150,151], i have 150, src has [1,151]
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170865 data_alloc: 234881024 data_used: 11649024
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:13.521281+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94191616 unmapped: 10559488 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a000 session 0x55d170188f00
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:14.521385+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97239040 unmapped: 7512064 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fb231000/0x0/0x4ffc00000, data 0x192a683/0x19ea000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:15.521547+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:16.521725+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:17.521846+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192885 data_alloc: 234881024 data_used: 14888960
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:18.522016+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fb231000/0x0/0x4ffc00000, data 0x192a683/0x19ea000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:19.522143+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:20.522285+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:21.522446+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fb231000/0x0/0x4ffc00000, data 0x192a683/0x19ea000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:22.522555+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192885 data_alloc: 234881024 data_used: 14888960
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:23.522669+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:24.522825+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.199676514s of 19.449176788s, submitted: 39
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:25.522952+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102621184 unmapped: 4235264 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:26.523116+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102588416 unmapped: 4268032 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa917000/0x0/0x4ffc00000, data 0x2245683/0x2305000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:27.523232+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102457344 unmapped: 4399104 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1274279 data_alloc: 234881024 data_used: 15114240
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:28.523345+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102457344 unmapped: 4399104 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa876000/0x0/0x4ffc00000, data 0x22e6683/0x23a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:29.523464+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102457344 unmapped: 4399104 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:30.523562+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 4210688 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:31.523776+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 4210688 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:32.523915+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1272359 data_alloc: 234881024 data_used: 15114240
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:33.524044+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:34.524385+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa852000/0x0/0x4ffc00000, data 0x230a683/0x23ca000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:35.524538+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:36.524741+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:37.524894+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.935168266s of 12.713699341s, submitted: 84
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1272375 data_alloc: 234881024 data_used: 15114240
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:38.525059+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:39.525224+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:40.525379+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa847000/0x0/0x4ffc00000, data 0x2315683/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:41.525540+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:42.525694+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa847000/0x0/0x4ffc00000, data 0x2315683/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271784 data_alloc: 234881024 data_used: 15114240
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:43.525858+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:44.525994+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:45.526134+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102096896 unmapped: 4759552 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17250dc00
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:46.526278+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102096896 unmapped: 4759552 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa844000/0x0/0x4ffc00000, data 0x2318683/0x23d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:47.526445+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d172672d20
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172994000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172994000 session 0x55d172896780
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17035d000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17035d000 session 0x55d17019e000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102088704 unmapped: 4767744 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a000 session 0x55d1725c54a0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042ac00
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1273484 data_alloc: 234881024 data_used: 15114240
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:48.526586+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042ac00 session 0x55d1725c5680
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103366656 unmapped: 3489792 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa844000/0x0/0x4ffc00000, data 0x2318683/0x23d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:49.526736+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d17266eb40
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172096800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.664448738s of 11.864383698s, submitted: 7
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103407616 unmapped: 9748480 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d170d8cd20
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17035d000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17035d000 session 0x55d1726721e0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a000 session 0x55d17034ef00
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042ac00
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042ac00 session 0x55d1703c14a0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d17289b680
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa844000/0x0/0x4ffc00000, data 0x2318683/0x23d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:50.526897+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103424000 unmapped: 9732096 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2b2000/0x0/0x4ffc00000, data 0x28aa683/0x296a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:51.527128+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103424000 unmapped: 9732096 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:52.527287+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103424000 unmapped: 9732096 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1318619 data_alloc: 234881024 data_used: 15642624
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:53.527411+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103424000 unmapped: 9732096 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:54.527603+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103424000 unmapped: 9732096 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2b2000/0x0/0x4ffc00000, data 0x28aa683/0x296a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172661400
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172661400 session 0x55d16f9e6960
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:55.527751+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103055360 unmapped: 10100736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:56.527910+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103055360 unmapped: 10100736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17035d000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:57.528047+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 104669184 unmapped: 8486912 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1355120 data_alloc: 234881024 data_used: 20631552
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:58.528184+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17019ab40
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:59.528327+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:00.528500+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2b1000/0x0/0x4ffc00000, data 0x28aa683/0x296a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:01.528664+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:02.528794+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1360592 data_alloc: 234881024 data_used: 21483520
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:03.528918+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.041341782s of 14.129011154s, submitted: 19
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2b1000/0x0/0x4ffc00000, data 0x28aa683/0x296a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:04.529081+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108208128 unmapped: 4947968 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2af000/0x0/0x4ffc00000, data 0x28ad683/0x296d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:05.529162+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 4915200 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:06.529346+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108249088 unmapped: 4907008 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2af000/0x0/0x4ffc00000, data 0x28ad683/0x296d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:07.529462+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108249088 unmapped: 4907008 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1360332 data_alloc: 234881024 data_used: 21483520
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:08.529597+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109010944 unmapped: 4145152 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:09.529766+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 2949120 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:10.529966+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 112525312 unmapped: 5578752 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:11.530108+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113344512 unmapped: 4759552 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f83e5000/0x0/0x4ffc00000, data 0x31b9683/0x3279000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:12.530235+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113393664 unmapped: 4710400 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1447918 data_alloc: 234881024 data_used: 22425600
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:13.530385+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113393664 unmapped: 4710400 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:14.530537+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113393664 unmapped: 4710400 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:15.530681+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113582080 unmapped: 4521984 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042ac00
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.826783180s of 12.373358727s, submitted: 98
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:16.530859+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f83e5000/0x0/0x4ffc00000, data 0x31b9683/0x3279000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:17.531005+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1442638 data_alloc: 234881024 data_used: 22425600
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:18.531196+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:19.531393+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:20.531565+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:21.531725+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f83ea000/0x0/0x4ffc00000, data 0x31c2683/0x3282000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:22.531931+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17035d000 session 0x55d17289a000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a000 session 0x55d1703c6d20
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17250dc00 session 0x55d1730cab40
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:23.532068+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1370749 data_alloc: 234881024 data_used: 18501632
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042ac00 session 0x55d170d8c1e0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d172e48000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:24.532256+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:25.532808+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:26.533501+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:27.533837+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f905a000/0x0/0x4ffc00000, data 0x2327683/0x23e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:28.534389+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1285633 data_alloc: 234881024 data_used: 15642624
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.467782021s of 12.590607643s, submitted: 41
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:29.534762+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9058000/0x0/0x4ffc00000, data 0x2327683/0x23e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:30.535279+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9058000/0x0/0x4ffc00000, data 0x2327683/0x23e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172670f00
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c000 session 0x55d172ba1860
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:31.535434+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109109248 unmapped: 8994816 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:32.535974+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9283000/0x0/0x4ffc00000, data 0x2327683/0x23e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [0,1])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 11517952 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1728970e0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:33.536094+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1166733 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:34.536286+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:35.536549+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:36.537000+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:37.537204+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:38.537502+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1166733 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:39.537694+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17035d000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.436046600s of 11.544039726s, submitted: 32
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:40.537963+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:41.538173+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:42.538356+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:43.538549+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1168245 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:44.538778+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:45.538934+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:46.539216+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:47.539406+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:48.539554+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1168245 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:49.539805+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:50.540038+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:51.540837+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:52.541669+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:53.541816+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1168245 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:54.542011+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:55.542165+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:56.542398+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:57.543171+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:58.553752+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1168245 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:59.554514+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:00.554665+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:01.554873+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a000 session 0x55d172ba0780
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172ba1a40
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172ba1680
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:02.555100+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d172ba14a0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c000 session 0x55d172ba0d20
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:03.555356+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17250dc00
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17250dc00 session 0x55d1701881e0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 23.435865402s of 23.439153671s, submitted: 1
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:39 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:39 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170045 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d170188f00
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d170d8c960
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d170d8c000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c000 session 0x55d170d8c5a0
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172993000
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172993000 session 0x55d16f9e6960
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:04.556133+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:05.556608+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d16f9e7a40
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:39 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:06.556966+0000)
Jan 26 10:19:39 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:39 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:07.557298+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d16f9e6000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:08.557625+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193785 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d16f9e65a0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c000 session 0x55d172ee6b40
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:09.557761+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172096400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:10.557878+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169c00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106422272 unmapped: 13852672 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:11.558128+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:12.558533+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:13.558865+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212814 data_alloc: 234881024 data_used: 14893056
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:14.559184+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:15.559343+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:16.559505+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:17.559666+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:18.559819+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212814 data_alloc: 234881024 data_used: 14893056
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:19.560128+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 13254656 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:20.560271+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 13254656 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:21.560428+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 13254656 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:22.560635+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.067176819s of 19.197723389s, submitted: 13
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 10002432 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:23.560797+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271222 data_alloc: 234881024 data_used: 15110144
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109248512 unmapped: 11026432 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9553000/0x0/0x4ffc00000, data 0x2052693/0x2113000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:24.560963+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:25.561133+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:26.561309+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:27.561506+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:28.561771+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278384 data_alloc: 234881024 data_used: 14929920
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:29.561967+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:30.562142+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:31.562364+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:32.562607+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:33.562820+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278384 data_alloc: 234881024 data_used: 14929920
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:34.562958+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:35.563107+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:36.563272+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:37.563491+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:38.563668+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278384 data_alloc: 234881024 data_used: 14929920
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:39.563815+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:40.563969+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:41.564180+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:42.565832+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:43.565972+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278384 data_alloc: 234881024 data_used: 14929920
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:44.566118+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:45.566251+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:46.566380+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:47.566504+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:48.566633+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278384 data_alloc: 234881024 data_used: 14929920
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:49.566840+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:50.566989+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:51.567092+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109895680 unmapped: 10379264 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:52.567227+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096400 session 0x55d17263d2c0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 30.033960342s of 30.179452896s, submitted: 56
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d1730ca5a0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109903872 unmapped: 10371072 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:53.567362+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172670000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1176888 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:54.567648+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:55.567794+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:56.567949+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:57.568097+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:58.568241+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1176888 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:59.568385+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:00.568577+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:01.568696+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:02.568887+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:03.569018+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1176888 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:04.569184+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:05.569316+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:06.569472+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:07.569590+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:08.569747+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1176888 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:09.569880+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:10.570004+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:11.570138+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:12.570270+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:13.570450+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1176888 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:14.570619+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:15.570753+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:16.570896+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:17.570991+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172effe00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d1726e9680
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c000 session 0x55d172b9e1e0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172e81c20
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 25.123188019s of 25.234869003s, submitted: 31
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 116154368 unmapped: 21454848 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172672780
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d172896f00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169c00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d170400d20
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172997800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:18.571106+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172997800 session 0x55d170400000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1703e32c0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1292908 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:19.571243+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:20.571399+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:21.571576+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:22.571720+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:23.571853+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1292908 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:24.571975+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:25.572121+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d170047680
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:26.572274+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169c00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:27.572396+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 19595264 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:28.572623+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1402500 data_alloc: 251658240 data_used: 28434432
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 19595264 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:29.572749+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 19595264 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:30.572955+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 19595264 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:31.573139+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 19595264 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:32.573278+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 19562496 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:33.573418+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1402500 data_alloc: 251658240 data_used: 28434432
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 19562496 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:34.573629+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 19562496 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:35.573771+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 19562496 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:36.573929+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 19562496 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:37.574063+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 20.038774490s of 20.122617722s, submitted: 17
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 17965056 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:38.574179+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1445412 data_alloc: 251658240 data_used: 28798976
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122085376 unmapped: 15523840 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:39.574326+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122789888 unmapped: 14819328 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:40.574449+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8a3b000/0x0/0x4ffc00000, data 0x2b71683/0x2c31000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 14786560 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:41.574632+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 14786560 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:42.574785+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 14786560 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:43.574916+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1458516 data_alloc: 251658240 data_used: 29515776
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 14786560 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:44.575040+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 14786560 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:45.575154+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8a2b000/0x0/0x4ffc00000, data 0x2b81683/0x2c41000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122855424 unmapped: 14753792 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:46.575458+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122855424 unmapped: 14753792 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:47.575734+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122863616 unmapped: 14745600 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:48.575930+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1458516 data_alloc: 251658240 data_used: 29515776
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122863616 unmapped: 14745600 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:49.576080+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122863616 unmapped: 14745600 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:50.576200+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8a2b000/0x0/0x4ffc00000, data 0x2b81683/0x2c41000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:51.576402+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 8201 writes, 31K keys, 8201 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 8201 writes, 1912 syncs, 4.29 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1363 writes, 4432 keys, 1363 commit groups, 1.0 writes per commit group, ingest: 4.48 MB, 0.01 MB/s
                                           Interval WAL: 1363 writes, 582 syncs, 2.34 writes per sync, written: 0.00 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:52.576587+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:53.576744+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8a2b000/0x0/0x4ffc00000, data 0x2b81683/0x2c41000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1458516 data_alloc: 251658240 data_used: 29515776
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:54.576892+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:55.577028+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:56.577207+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172494400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d173144f00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c400 session 0x55d172037a40
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254d000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254d000 session 0x55d1730ca5a0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254d000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254d000 session 0x55d1730caf00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.439634323s of 18.733009338s, submitted: 55
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1730cbc20
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172896960
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172494400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d172ee7c20
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c400 session 0x55d170188960
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 15179776 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:57.577342+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c400 session 0x55d172ee6b40
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:58.577498+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 15179776 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f87cb000/0x0/0x4ffc00000, data 0x2de06e5/0x2ea1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1476366 data_alloc: 251658240 data_used: 29515776
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:59.577687+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 15179776 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:00.577873+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 15179776 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f87cb000/0x0/0x4ffc00000, data 0x2de06e5/0x2ea1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:01.578037+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 15179776 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17259ab40
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:02.578184+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 121585664 unmapped: 16023552 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172494400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:03.578355+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 121585664 unmapped: 16023552 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f87ca000/0x0/0x4ffc00000, data 0x2de0708/0x2ea2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1481927 data_alloc: 251658240 data_used: 30072832
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:04.578466+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 121872384 unmapped: 15736832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:05.578628+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123502592 unmapped: 14106624 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:06.578855+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123502592 unmapped: 14106624 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:07.579012+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123502592 unmapped: 14106624 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:08.579158+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123502592 unmapped: 14106624 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1494239 data_alloc: 251658240 data_used: 31916032
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:09.579392+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123502592 unmapped: 14106624 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f87ca000/0x0/0x4ffc00000, data 0x2de0708/0x2ea2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:10.579579+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123510784 unmapped: 14098432 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.753458977s of 13.886870384s, submitted: 41
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:11.579768+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 14065664 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:12.579931+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 14065664 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:13.580067+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 14065664 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f87c8000/0x0/0x4ffc00000, data 0x2de1708/0x2ea3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1494831 data_alloc: 251658240 data_used: 31920128
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:14.580251+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 14065664 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:15.580424+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131710976 unmapped: 5898240 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:16.580654+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 133734400 unmapped: 3874816 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7899000/0x0/0x4ffc00000, data 0x3d11708/0x3dd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:17.580778+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132063232 unmapped: 5545984 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:18.580964+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132063232 unmapped: 5545984 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1622323 data_alloc: 251658240 data_used: 33873920
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7872000/0x0/0x4ffc00000, data 0x3d38708/0x3dfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:19.581127+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132063232 unmapped: 5545984 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7872000/0x0/0x4ffc00000, data 0x3d38708/0x3dfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:20.581263+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132063232 unmapped: 5545984 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:21.581465+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132063232 unmapped: 5545984 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.394704819s of 10.849593163s, submitted: 125
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:22.581655+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132079616 unmapped: 5529600 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172670000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d1720374a0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254d000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f786f000/0x0/0x4ffc00000, data 0x3d3b708/0x3dfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:23.581814+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254d000 session 0x55d1703c65a0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 7643136 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1463639 data_alloc: 251658240 data_used: 29515776
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f842a000/0x0/0x4ffc00000, data 0x2b826a6/0x2c43000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:24.581986+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 7643136 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:25.582168+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 7643136 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:26.582330+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 7643136 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:27.582465+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 7643136 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:28.582650+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 7634944 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1463639 data_alloc: 251658240 data_used: 29515776
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d170047c20
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d17259b680
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:29.582806+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 7634944 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254d000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254d000 session 0x55d172671860
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f842a000/0x0/0x4ffc00000, data 0x2b82683/0x2c42000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:30.583027+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:31.583235+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:32.583449+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:33.583637+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197707 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:34.583814+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:35.583958+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:36.584141+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:37.584320+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:38.584430+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197707 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:39.584753+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:40.585023+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:41.585679+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:42.585935+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:43.586281+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197707 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:44.587342+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:45.587794+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:46.588369+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:47.588547+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:48.588797+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197707 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:49.589609+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:50.589863+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:51.590164+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:52.590533+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:53.590787+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197707 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:54.591017+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:55.591246+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:56.591598+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 34.995807648s of 35.219715118s, submitted: 71
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1721d2000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d170401c20
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d170c97a40
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d1703c54a0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169c00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d1703c4780
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:57.591823+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9e2d000/0x0/0x4ffc00000, data 0x177e6e5/0x183f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:58.592001+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1218652 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:59.592222+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:00.592428+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:01.592625+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9e2d000/0x0/0x4ffc00000, data 0x177e6e5/0x183f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:02.592767+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9e2d000/0x0/0x4ffc00000, data 0x177e6e5/0x183f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:03.593060+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254d000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254d000 session 0x55d1726e8960
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1218652 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:04.593219+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172494400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:05.593348+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:06.593573+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 22233088 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:07.593755+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:08.593946+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9e2d000/0x0/0x4ffc00000, data 0x177e6e5/0x183f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1222452 data_alloc: 234881024 data_used: 12705792
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:09.594116+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:10.594320+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:11.594546+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:12.594800+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:13.594957+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9e2d000/0x0/0x4ffc00000, data 0x177e6e5/0x183f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1222452 data_alloc: 234881024 data_used: 12705792
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:14.595141+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:15.595299+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:16.595511+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.421459198s of 19.482046127s, submitted: 29
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:17.595723+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117194752 unmapped: 20414464 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:18.595912+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119021568 unmapped: 18587648 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:19.596107+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120135680 unmapped: 17473536 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:20.596285+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120143872 unmapped: 17465344 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:21.596481+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120143872 unmapped: 17465344 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:22.596785+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120143872 unmapped: 17465344 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:23.596960+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120143872 unmapped: 17465344 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:24.597120+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:25.597303+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:26.597494+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:27.597653+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:28.597832+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:29.598029+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:30.598251+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:31.598496+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:32.598685+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:33.598921+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:34.599075+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:35.599228+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:36.599428+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:37.599616+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:38.599806+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:39.599969+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:40.600149+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:41.600347+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:42.600512+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:43.600768+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:44.600948+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:45.601239+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:46.601430+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:47.601636+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120168448 unmapped: 17440768 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:48.601789+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120168448 unmapped: 17440768 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 31.392076492s of 32.704330444s, submitted: 79
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:49.601914+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1262116 data_alloc: 234881024 data_used: 13058048
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 18792448 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:50.602038+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118939648 unmapped: 18669568 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:51.602202+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118939648 unmapped: 18669568 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f98a7000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:52.602371+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118939648 unmapped: 18669568 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:53.602573+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118939648 unmapped: 18669568 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:54.602802+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1262116 data_alloc: 234881024 data_used: 13058048
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f98a7000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118939648 unmapped: 18669568 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:55.603028+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118947840 unmapped: 18661376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:56.603219+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118947840 unmapped: 18661376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1721fde00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:57.603402+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1721fc3c0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1721fd680
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d17266f680
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169c00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d17266fa40
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d17266e960
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d17266f0e0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119488512 unmapped: 25468928 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17266e000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d17266e780
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f98a7000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:58.603535+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119488512 unmapped: 25468928 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:59.603754+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1302588 data_alloc: 234881024 data_used: 13058048
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.978124619s of 10.595539093s, submitted: 169
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f5000/0x0/0x4ffc00000, data 0x21b66e5/0x2277000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 25534464 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:00.603935+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 25534464 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:01.604117+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 25534464 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f5000/0x0/0x4ffc00000, data 0x21b66e5/0x2277000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:02.604245+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 25534464 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:03.604386+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f5000/0x0/0x4ffc00000, data 0x21b66e5/0x2277000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 25534464 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169c00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d17266fc20
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:04.604582+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1304100 data_alloc: 234881024 data_used: 13041664
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d17266f4a0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119439360 unmapped: 25518080 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:05.604765+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119439360 unmapped: 25518080 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:06.605006+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f5000/0x0/0x4ffc00000, data 0x21b66e5/0x2277000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119447552 unmapped: 25509888 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:07.605153+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d17266e1e0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119447552 unmapped: 25509888 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d173145e00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:08.605800+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 25501696 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:09.605976+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1303553 data_alloc: 234881024 data_used: 13041664
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 25501696 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f4000/0x0/0x4ffc00000, data 0x21b6708/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:10.606170+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 25501696 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:11.606356+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 25501696 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:12.606485+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 25501696 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f4000/0x0/0x4ffc00000, data 0x21b6708/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:13.606674+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:14.606811+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1314801 data_alloc: 234881024 data_used: 14536704
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f4000/0x0/0x4ffc00000, data 0x21b6708/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:15.607006+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d172659680
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:16.607196+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:17.607355+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:18.607502+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:19.607668+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1335625 data_alloc: 234881024 data_used: 17674240
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f4000/0x0/0x4ffc00000, data 0x21b6708/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f4000/0x0/0x4ffc00000, data 0x21b6708/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:20.607822+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:21.607974+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:22.608155+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:23.608331+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:24.608488+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1335625 data_alloc: 234881024 data_used: 17674240
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 25.355880737s of 25.392654419s, submitted: 18
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:25.608683+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8a2c000/0x0/0x4ffc00000, data 0x2b76708/0x2c38000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 22511616 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:26.608933+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172162400 session 0x55d1730ca1e0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172166800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 22511616 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:27.609121+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122363904 unmapped: 22593536 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:28.609344+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122363904 unmapped: 22593536 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:29.609545+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1409087 data_alloc: 234881024 data_used: 17907712
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f899e000/0x0/0x4ffc00000, data 0x2c0c708/0x2cce000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,0,0,1,1])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122363904 unmapped: 22593536 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:30.609762+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122634240 unmapped: 22323200 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:31.609939+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122855424 unmapped: 22102016 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:32.610073+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122871808 unmapped: 22085632 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:33.610280+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122757120 unmapped: 22200320 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f891a000/0x0/0x4ffc00000, data 0x2c90708/0x2d52000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:34.610588+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1419099 data_alloc: 234881024 data_used: 17899520
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 22134784 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 4.809408665s of 10.088632584s, submitted: 133
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:35.610761+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122904576 unmapped: 22052864 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:36.611029+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122912768 unmapped: 22044672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:37.611179+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f891a000/0x0/0x4ffc00000, data 0x2c90708/0x2d52000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 22036480 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:38.611293+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f88f9000/0x0/0x4ffc00000, data 0x2cb1708/0x2d73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 22036480 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:39.611572+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1420491 data_alloc: 234881024 data_used: 17899520
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122929152 unmapped: 22028288 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:40.611891+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d1721d2000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d1703c61e0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122937344 unmapped: 22020096 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:41.612133+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17259b680
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:42.612298+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:43.612486+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:44.612636+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1273109 data_alloc: 234881024 data_used: 13041664
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f98a6000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:45.612766+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:46.612962+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:47.615603+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f98a6000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:48.619106+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.456459999s of 13.937705994s, submitted: 67
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d1703c6960
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c400 session 0x55d1725841e0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:49.620178+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1272745 data_alloc: 234881024 data_used: 13041664
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d1726701e0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:50.621207+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:51.621365+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:52.621727+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:53.623234+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:54.623448+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:55.623988+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:56.624524+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:57.624778+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:58.624957+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:59.625352+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:00.625649+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:01.625836+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:02.625964+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:03.626111+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:04.626302+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:05.626681+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:06.626932+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:07.627192+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:08.627437+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:09.627676+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:10.627920+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:11.628183+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:12.628445+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:13.628603+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:14.628851+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:15.629067+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:16.629335+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:17.629609+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:18.629848+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:19.630098+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:20.631352+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:21.632359+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:22.633282+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:23.634081+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:24.634875+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:25.635551+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:26.636091+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:27.636384+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:28.636771+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:29.637380+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:30.638063+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:31.638614+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:32.639199+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:33.639611+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:34.639792+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:35.639898+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 46.250705719s of 46.323696136s, submitted: 22
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d1703c4f00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d170d8dc20
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d17263c780
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d170d8d2c0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172494400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d172895860
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:36.640439+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9eff000/0x0/0x4ffc00000, data 0x16ad683/0x176d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:37.640820+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9eff000/0x0/0x4ffc00000, data 0x16ad683/0x176d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:38.641202+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:39.641536+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224861 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:40.641667+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9eff000/0x0/0x4ffc00000, data 0x16ad683/0x176d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c400 session 0x55d17263d680
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1730cbc20
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:41.641773+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119824384 unmapped: 25133056 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d170046780
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:42.641930+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d1721fc780
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118849536 unmapped: 26107904 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172494400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:43.642202+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118734848 unmapped: 26222592 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:44.642358+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 26214400 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1229754 data_alloc: 234881024 data_used: 12795904
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:45.642588+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 26214400 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:46.642954+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9eff000/0x0/0x4ffc00000, data 0x16ad683/0x176d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 26214400 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d1720954a0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.380515099s of 11.418202400s, submitted: 13
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d17263c960
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:47.643590+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17266fe00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118751232 unmapped: 26206208 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:48.643953+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118751232 unmapped: 26206208 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: mgrc ms_handle_reset ms_handle_reset con 0x55d171202c00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/2891176105
Jan 26 10:19:40 compute-1 ceph-osd[77632]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/2891176105,v1:192.168.122.100:6801/2891176105]
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: get_auth_request con 0x55d171202800 auth_method 0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: mgrc handle_mgr_configure stats_period=5
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:49.644151+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216030 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172167c00 session 0x55d1725c45a0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17116ac00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:50.644309+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164400 session 0x55d172eff4a0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172507800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:51.644480+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:52.644748+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:53.644977+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:54.645137+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216030 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:55.645601+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:56.645965+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:57.646115+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:58.646375+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:59.646643+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216030 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:00.646917+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:01.647067+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.299061775s of 15.430803299s, submitted: 25
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:02.647207+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164400 session 0x55d172ee72c0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d1721fc780
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d1703c5e00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172494400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d1728943c0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172895860
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:03.647436+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:04.647666+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:05.648028+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:06.648221+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f955c000/0x0/0x4ffc00000, data 0x2050683/0x2110000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:07.648416+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f955c000/0x0/0x4ffc00000, data 0x2050683/0x2110000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d1703c70e0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d17259af00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:08.648613+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118349824 unmapped: 36126720 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164400 session 0x55d17259a000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:09.648804+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1726701e0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f955a000/0x0/0x4ffc00000, data 0x20506b6/0x2112000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117383168 unmapped: 37093376 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301098 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:10.649036+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117170176 unmapped: 37306368 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:11.649199+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123084800 unmapped: 31391744 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1703c14a0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17293a780
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:12.649409+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172993800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.107300758s of 10.256335258s, submitted: 30
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117915648 unmapped: 36560896 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172993800 session 0x55d16f7d4000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:13.649553+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:14.649749+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:15.649910+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:16.650116+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:17.650251+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:18.650425+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:19.650601+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:20.650776+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:21.650929+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:22.651100+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:23.651295+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:24.651461+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 36536320 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:25.651625+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 36536320 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:26.652009+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 36536320 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:27.652664+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 36536320 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:28.653162+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 36536320 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:29.653750+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:30.653899+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:31.654330+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:32.654969+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:33.655416+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:34.656020+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:35.656406+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:36.656842+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:37.657160+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:38.657343+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:39.657591+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:40.657968+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:41.658253+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:42.658661+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172528000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528000 session 0x55d17266fe00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172dc3c00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d173145c20
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172dc3c00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d173144960
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172659680
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:43.659053+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 30.658174515s of 30.887229919s, submitted: 38
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1731441e0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172528000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528000 session 0x55d172095c20
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172993800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172993800 session 0x55d172897680
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172993800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172993800 session 0x55d170c97a40
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172894d20
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 38584320 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:44.659287+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 38584320 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1293569 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:45.659516+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 38584320 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:46.659864+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d172894b40
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 38584320 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172528000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528000 session 0x55d172896960
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:47.660063+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172dc3c00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d172095860
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172dc3c00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117555200 unmapped: 38543360 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d172ee70e0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:48.660341+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f962c000/0x0/0x4ffc00000, data 0x1f80683/0x2040000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117858304 unmapped: 38240256 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:49.660784+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117858304 unmapped: 38240256 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1320800 data_alloc: 234881024 data_used: 15335424
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:50.660990+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:51.661223+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x1fa46b6/0x2066000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:52.661455+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x1fa46b6/0x2066000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:53.661891+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x1fa46b6/0x2066000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:54.662209+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x1fa46b6/0x2066000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1359864 data_alloc: 234881024 data_used: 21143552
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:55.662435+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:56.662682+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120209408 unmapped: 35889152 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:57.662910+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120209408 unmapped: 35889152 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:58.663116+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120209408 unmapped: 35889152 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:59.663334+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x1fa46b6/0x2066000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120209408 unmapped: 35889152 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1359864 data_alloc: 234881024 data_used: 21143552
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:00.663485+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120209408 unmapped: 35889152 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.557689667s of 17.637289047s, submitted: 14
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:01.663644+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 32169984 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:02.663838+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124157952 unmapped: 31940608 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:03.664058+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f919b000/0x0/0x4ffc00000, data 0x240f6b6/0x24d1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124321792 unmapped: 31776768 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:04.664245+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124329984 unmapped: 31768576 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1417424 data_alloc: 234881024 data_used: 21848064
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:05.664371+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:06.664617+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:07.664895+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9169000/0x0/0x4ffc00000, data 0x24406b6/0x2502000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:08.665131+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:09.665327+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1420008 data_alloc: 234881024 data_used: 22011904
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:10.665454+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:11.665639+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:12.665864+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172036f00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d172585e00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:13.666041+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9167000/0x0/0x4ffc00000, data 0x24436b6/0x2505000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172097800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.894859314s of 12.680984497s, submitted: 52
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:14.666285+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d172e80b40
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1237102 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:15.666479+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:16.666871+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:17.667124+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:18.667356+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:19.667556+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1237102 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:20.667769+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:21.668358+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:22.668597+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:23.668780+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:24.669038+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1237102 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:25.669212+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:26.669505+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:27.669745+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:28.669905+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:29.670677+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:30.670957+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1237102 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:31.672897+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:32.675825+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:33.676057+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:34.678385+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 38641664 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:35.678590+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1237102 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 38641664 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:36.679036+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 38641664 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:37.679292+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172ae9000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 23.971050262s of 24.083293915s, submitted: 28
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172ae9000 session 0x55d17311c1e0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172897a40
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172097800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d172095c20
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1726e83c0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172dc3c00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d172efed20
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 46235648 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:38.679448+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 46235648 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:39.679684+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 46227456 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:40.679933+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1353511 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9045000/0x0/0x4ffc00000, data 0x2567683/0x2627000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 46227456 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:41.680103+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 46227456 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:42.680284+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9045000/0x0/0x4ffc00000, data 0x2567683/0x2627000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 46219264 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:43.680444+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172163400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172163400 session 0x55d173144960
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172163400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172163400 session 0x55d1725845a0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 46219264 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:44.680652+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117760000 unmapped: 46211072 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:45.680812+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1353511 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9045000/0x0/0x4ffc00000, data 0x2567683/0x2627000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1721d3a40
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172097800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d172e81e00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118374400 unmapped: 45596672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:46.681274+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172dc3c00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118382592 unmapped: 45588480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:47.681731+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123191296 unmapped: 40779776 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:48.682139+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128458752 unmapped: 35512320 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:49.682586+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9020000/0x0/0x4ffc00000, data 0x258b693/0x264c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128491520 unmapped: 35479552 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:50.682895+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1468282 data_alloc: 251658240 data_used: 28352512
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128491520 unmapped: 35479552 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:51.683332+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128499712 unmapped: 35471360 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:52.683817+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128499712 unmapped: 35471360 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:53.684212+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9020000/0x0/0x4ffc00000, data 0x258b693/0x264c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128532480 unmapped: 35438592 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:54.684494+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128532480 unmapped: 35438592 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:55.684647+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1468282 data_alloc: 251658240 data_used: 28352512
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128532480 unmapped: 35438592 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:56.684886+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9020000/0x0/0x4ffc00000, data 0x258b693/0x264c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128532480 unmapped: 35438592 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:57.685281+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128548864 unmapped: 35422208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:58.685655+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172ae8c00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172ae8c00 session 0x55d170d8d0e0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172096800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d170d8dc20
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 20.590911865s of 20.736030579s, submitted: 31
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d170401e00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172096800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d172673680
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172097800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d172672f00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172528c00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528c00 session 0x55d1726730e0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172c29800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172c29800 session 0x55d17034e1e0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172672d20
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172096800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d16f9e63c0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137723904 unmapped: 26247168 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:59.685904+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 25911296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:00.686058+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1635802 data_alloc: 251658240 data_used: 29327360
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 25911296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:01.686269+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e20000/0x0/0x4ffc00000, data 0x3781705/0x3844000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 25911296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:02.686479+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 25911296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:03.686603+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172097800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 138067968 unmapped: 25903104 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:04.686762+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d17259b680
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172528c00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17118a400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 136216576 unmapped: 27754496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:05.686940+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1632727 data_alloc: 251658240 data_used: 29339648
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137707520 unmapped: 26263552 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:06.687112+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e04000/0x0/0x4ffc00000, data 0x37a5705/0x3868000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:07.687233+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:08.687416+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:09.687551+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e04000/0x0/0x4ffc00000, data 0x37a5705/0x3868000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e04000/0x0/0x4ffc00000, data 0x37a5705/0x3868000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:10.687714+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1686383 data_alloc: 251658240 data_used: 34185216
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:11.687834+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e04000/0x0/0x4ffc00000, data 0x37a5705/0x3868000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:12.688042+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:13.688208+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139853824 unmapped: 24117248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:14.688393+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139853824 unmapped: 24117248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:15.688533+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e04000/0x0/0x4ffc00000, data 0x37a5705/0x3868000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1686383 data_alloc: 251658240 data_used: 34185216
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.407573700s of 17.746696472s, submitted: 131
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 140615680 unmapped: 23355392 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:16.688724+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143704064 unmapped: 20267008 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:17.688908+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f737e000/0x0/0x4ffc00000, data 0x4223705/0x42e6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143548416 unmapped: 20422656 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:40.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:18.689058+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143171584 unmapped: 20799488 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:19.689208+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7304000/0x0/0x4ffc00000, data 0x42a5705/0x4368000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143171584 unmapped: 20799488 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:20.689343+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7304000/0x0/0x4ffc00000, data 0x42a5705/0x4368000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1784443 data_alloc: 251658240 data_used: 35086336
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:21.689451+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143171584 unmapped: 20799488 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:22.689606+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143179776 unmapped: 20791296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:23.689781+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143179776 unmapped: 20791296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7304000/0x0/0x4ffc00000, data 0x42a5705/0x4368000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:24.690009+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143212544 unmapped: 20758528 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:25.690187+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143212544 unmapped: 20758528 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1785315 data_alloc: 251658240 data_used: 35086336
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:26.690368+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f72e3000/0x0/0x4ffc00000, data 0x42c6705/0x4389000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143245312 unmapped: 20725760 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:27.690565+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143245312 unmapped: 20725760 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:28.690732+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143269888 unmapped: 20701184 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f72e3000/0x0/0x4ffc00000, data 0x42c6705/0x4389000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:29.690882+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143269888 unmapped: 20701184 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:30.691054+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143269888 unmapped: 20701184 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1786531 data_alloc: 251658240 data_used: 35164160
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.249229431s of 14.751939774s, submitted: 97
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:31.691242+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143269888 unmapped: 20701184 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f72e3000/0x0/0x4ffc00000, data 0x42c6705/0x4389000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:32.691368+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 142999552 unmapped: 20971520 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:33.691572+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143065088 unmapped: 20905984 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:34.692287+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143065088 unmapped: 20905984 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528c00 session 0x55d17311c780
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17118a400 session 0x55d16f7d4780
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:35.692979+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143065088 unmapped: 20905984 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1786215 data_alloc: 251658240 data_used: 35164160
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1704003c0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:36.693410+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137355264 unmapped: 26615808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:37.693922+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137355264 unmapped: 26615808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7f93000/0x0/0x4ffc00000, data 0x2fb8693/0x3079000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:38.694396+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137355264 unmapped: 26615808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:39.694558+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137371648 unmapped: 26599424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:40.695067+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137371648 unmapped: 26599424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1579255 data_alloc: 234881024 data_used: 26243072
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:41.695482+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137371648 unmapped: 26599424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.313569069s of 11.179004669s, submitted: 65
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d172ee61e0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d17311c000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:42.703118+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137371648 unmapped: 26599424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172096800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:43.703302+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f81e3000/0x0/0x4ffc00000, data 0x2fb8693/0x3079000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137388032 unmapped: 26583040 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:44.703665+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125648896 unmapped: 38322176 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:45.707252+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1275770 data_alloc: 234881024 data_used: 12288000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:46.707571+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [0,0,1])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d172eff4a0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:47.708069+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:48.708361+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:49.708651+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:50.708989+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1269126 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:51.709332+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:52.709669+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:53.709958+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:54.710117+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:55.710367+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1269126 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:56.710675+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:57.710922+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:58.711183+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:59.711347+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:00.711606+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1269126 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:01.711844+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:02.712089+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:03.712292+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:04.712591+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:05.712812+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1269126 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:06.713001+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:07.713221+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:08.713417+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:09.713573+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172097800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d172efe3c0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1726725a0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172096800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d1726590e0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d172659a40
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172dc3c00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:10.713790+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 25.249382019s of 28.619909286s, submitted: 55
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1371866 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d172e80960
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172528c00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528c00 session 0x55d17311c3c0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172528c00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528c00 session 0x55d172585c20
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17259af00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:11.714001+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172096800
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d172894000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:12.714198+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9067000/0x0/0x4ffc00000, data 0x2135683/0x21f5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:13.714413+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:14.902982+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:15.903133+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1352162 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:16.903309+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9067000/0x0/0x4ffc00000, data 0x2135683/0x21f5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d172673e00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:17.903474+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125812736 unmapped: 38158336 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172dc3c00
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202000
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:18.903746+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125812736 unmapped: 38158336 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:19.903894+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:20.904081+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1432927 data_alloc: 234881024 data_used: 24039424
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:21.904233+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9066000/0x0/0x4ffc00000, data 0x21356a6/0x21f6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:22.904443+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:23.904884+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9066000/0x0/0x4ffc00000, data 0x21356a6/0x21f6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:24.905026+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:25.905197+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9066000/0x0/0x4ffc00000, data 0x21356a6/0x21f6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1432927 data_alloc: 234881024 data_used: 24039424
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:26.905440+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:27.905595+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:28.905735+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9066000/0x0/0x4ffc00000, data 0x21356a6/0x21f6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.371334076s of 18.521051407s, submitted: 16
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:29.905878+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 130768896 unmapped: 33202176 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f50000/0x0/0x4ffc00000, data 0x22456a6/0x2306000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:30.906057+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 130768896 unmapped: 33202176 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1444671 data_alloc: 234881024 data_used: 24260608
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:31.906229+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:32.906431+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f2d000/0x0/0x4ffc00000, data 0x22666a6/0x2327000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:33.906577+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f2d000/0x0/0x4ffc00000, data 0x22666a6/0x2327000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:34.906725+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f2d000/0x0/0x4ffc00000, data 0x22666a6/0x2327000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:35.906917+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1457483 data_alloc: 234881024 data_used: 24080384
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:36.907099+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:37.907240+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:38.907395+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f2d000/0x0/0x4ffc00000, data 0x22666a6/0x2327000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:39.907572+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:40.907793+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131129344 unmapped: 32841728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1457483 data_alloc: 234881024 data_used: 24080384
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:41.907957+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131129344 unmapped: 32841728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:42.908111+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131129344 unmapped: 32841728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d172e810e0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202000 session 0x55d1726701e0
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:43.908288+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f2d000/0x0/0x4ffc00000, data 0x22666a6/0x2327000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131129344 unmapped: 32841728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:44.908483+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131137536 unmapped: 32833536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.647443771s of 15.735019684s, submitted: 39
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:45.908751+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124911616 unmapped: 39059456 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280707 data_alloc: 234881024 data_used: 12181504
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e26a6/0x16a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:46.908918+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17311cd20
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:47.909091+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:48.909300+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:49.909451+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:50.909627+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:51.909846+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:52.910045+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:53.910218+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:54.910447+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:55.910671+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:56.911018+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:57.911211+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:58.911374+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:59.911783+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:00.911962+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:01.912104+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:02.912266+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:03.912465+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:04.912669+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:05.912782+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:06.913062+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:07.913241+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:08.913419+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 26 10:19:40 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2332631685' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:09.913576+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:10.913767+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:11.913944+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:12.914105+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:13.914242+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:14.914397+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:15.914541+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:16.914790+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:17.915022+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:18.915201+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:19.915377+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:20.915561+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:21.915743+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:22.915903+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:23.916031+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:24.916209+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:25.916356+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:26.916527+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:27.916630+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:28.916761+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:29.916888+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:30.917055+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:31.917190+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:32.917333+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:33.917517+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:34.917666+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:35.917780+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:36.917991+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:37.918162+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:38.918373+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:39.918551+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:40.918779+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:41.918972+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:42.919238+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:43.919817+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:44.920515+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:45.920823+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:46.922768+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:47.923927+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:48.924318+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:49.924834+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:50.925489+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:51.925857+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:52.926321+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:53.926868+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:54.927001+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:55.927169+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:56.927415+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:57.927591+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:58.927756+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:59.927914+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:00.928074+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:01.928191+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:02.928359+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:03.928513+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:04.928649+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:05.928778+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:19:40 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:19:40 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:06.928925+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: do_command 'config diff' '{prefix=config diff}'
Jan 26 10:19:40 compute-1 ceph-osd[77632]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 26 10:19:40 compute-1 ceph-osd[77632]: do_command 'config show' '{prefix=config show}'
Jan 26 10:19:40 compute-1 ceph-osd[77632]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125296640 unmapped: 38674432 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: do_command 'counter dump' '{prefix=counter dump}'
Jan 26 10:19:40 compute-1 ceph-osd[77632]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 26 10:19:40 compute-1 ceph-osd[77632]: do_command 'counter schema' '{prefix=counter schema}'
Jan 26 10:19:40 compute-1 ceph-osd[77632]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:07.929126+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124674048 unmapped: 39297024 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:19:40 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:08.929293+0000)
Jan 26 10:19:40 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:19:40 compute-1 ceph-osd[77632]: do_command 'log dump' '{prefix=log dump}'
Jan 26 10:19:40 compute-1 nova_compute[226322]: 2026-01-26 10:19:40.320 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:40 compute-1 nova_compute[226322]: 2026-01-26 10:19:40.428 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:40.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:40 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 26 10:19:40 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3949276204' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 10:19:40 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1020071750' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 10:19:40 compute-1 ceph-mon[80107]: from='client.26023 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:40 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1248639188' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:19:40 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2545354252' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 10:19:40 compute-1 ceph-mon[80107]: from='client.16785 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:40 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1848180643' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 10:19:40 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/136388260' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 26 10:19:40 compute-1 ceph-mon[80107]: from='client.26044 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:40 compute-1 ceph-mon[80107]: from='client.26423 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:40 compute-1 ceph-mon[80107]: from='client.16803 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:40 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1794654331' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 10:19:40 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2332631685' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 26 10:19:40 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2902301858' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 10:19:40 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1589349312' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 26 10:19:40 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 26 10:19:40 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/461349660' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 10:19:41 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Jan 26 10:19:41 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1644447967' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 26 10:19:41 compute-1 crontab[241395]: (root) LIST (root)
Jan 26 10:19:41 compute-1 nova_compute[226322]: 2026-01-26 10:19:41.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:19:41 compute-1 nova_compute[226322]: 2026-01-26 10:19:41.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:19:41 compute-1 nova_compute[226322]: 2026-01-26 10:19:41.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:19:41 compute-1 nova_compute[226322]: 2026-01-26 10:19:41.711 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:19:41 compute-1 nova_compute[226322]: 2026-01-26 10:19:41.711 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:19:41 compute-1 ceph-mon[80107]: from='client.26059 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:41 compute-1 ceph-mon[80107]: from='client.26435 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:41 compute-1 ceph-mon[80107]: from='client.16821 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:41 compute-1 ceph-mon[80107]: from='client.26080 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:41 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3949276204' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 10:19:41 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1634668375' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 10:19:41 compute-1 ceph-mon[80107]: from='client.26453 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:41 compute-1 ceph-mon[80107]: from='client.16839 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:41 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1900355160' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 10:19:41 compute-1 ceph-mon[80107]: pgmap v1126: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:19:41 compute-1 ceph-mon[80107]: from='client.26095 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:41 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/461349660' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 10:19:41 compute-1 ceph-mon[80107]: from='client.26471 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:41 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2292204840' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 26 10:19:41 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3806127034' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 10:19:41 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1498955387' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:19:41 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1644447967' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 26 10:19:41 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3632884701' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 10:19:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:42.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:19:42 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4041690340' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:19:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Jan 26 10:19:42 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1604141855' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 26 10:19:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:19:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:42.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:19:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:19:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Jan 26 10:19:42 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/237639938' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:19:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:19:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:19:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.16854 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.26110 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.26486 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.16866 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2814067036' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.26504 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.26140 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/4142164008' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.16890 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/4041690340' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1604141855' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1090076573' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/801966427' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/237639938' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Jan 26 10:19:43 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/358209871' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Jan 26 10:19:43 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1790643785' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Jan 26 10:19:43 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/499056126' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 26 10:19:43 compute-1 nova_compute[226322]: 2026-01-26 10:19:43.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:19:43 compute-1 nova_compute[226322]: 2026-01-26 10:19:43.706 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:19:43 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Jan 26 10:19:43 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3776534711' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.26516 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.26173 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.16908 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.26534 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.26194 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.16920 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: pgmap v1127: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.26218 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1665467363' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/358209871' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3411450156' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2000606720' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1790643785' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/4117579945' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3326537678' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/499056126' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 26 10:19:43 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3776534711' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 26 10:19:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:44.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:44 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Jan 26 10:19:44 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2654797244' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 26 10:19:44 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Jan 26 10:19:44 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4023938435' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 26 10:19:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:19:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:44.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:19:44 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Jan 26 10:19:44 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4005334717' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 26 10:19:44 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Jan 26 10:19:44 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1440409755' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 26 10:19:44 compute-1 systemd[1]: Starting Hostname Service...
Jan 26 10:19:45 compute-1 systemd[1]: Started Hostname Service.
Jan 26 10:19:45 compute-1 ceph-mon[80107]: from='client.16932 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:45 compute-1 ceph-mon[80107]: from='client.26233 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:45 compute-1 ceph-mon[80107]: from='client.26248 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/525525554' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 26 10:19:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3198617286' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 26 10:19:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3086565525' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 26 10:19:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2654797244' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 26 10:19:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/4023938435' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 26 10:19:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2172289278' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 26 10:19:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1315709763' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 26 10:19:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3299340595' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 26 10:19:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2192620793' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 26 10:19:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/4005334717' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 26 10:19:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1440409755' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 26 10:19:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2836902024' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 26 10:19:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3702236953' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 26 10:19:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2966148949' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 26 10:19:45 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Jan 26 10:19:45 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1985737188' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 26 10:19:45 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Jan 26 10:19:45 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3473383718' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 26 10:19:45 compute-1 nova_compute[226322]: 2026-01-26 10:19:45.334 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:45 compute-1 nova_compute[226322]: 2026-01-26 10:19:45.428 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:45 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Jan 26 10:19:45 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3462755018' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 26 10:19:45 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Jan 26 10:19:45 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2009624782' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 26 10:19:45 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Jan 26 10:19:45 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1215151384' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 26 10:19:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:46.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:46 compute-1 sudo[242006]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:19:46 compute-1 sudo[242006]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:19:46 compute-1 sudo[242006]: pam_unix(sudo:session): session closed for user root
Jan 26 10:19:46 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Jan 26 10:19:46 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/782210037' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 26 10:19:46 compute-1 ceph-mon[80107]: pgmap v1128: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:19:46 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3843245282' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 26 10:19:46 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2724956292' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 26 10:19:46 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1985737188' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 26 10:19:46 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3473383718' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 26 10:19:46 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1745987047' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 26 10:19:46 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1702451587' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 26 10:19:46 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3799741745' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 26 10:19:46 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3462755018' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 26 10:19:46 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2009624782' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 26 10:19:46 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1215151384' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 26 10:19:46 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2500859101' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 26 10:19:46 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/376801596' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 26 10:19:46 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1043531508' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 26 10:19:46 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Jan 26 10:19:46 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1380287593' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 26 10:19:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:46.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:47 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1338422426' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 26 10:19:47 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/782210037' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 26 10:19:47 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1380287593' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 26 10:19:47 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3007954583' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 26 10:19:47 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3704633609' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 26 10:19:47 compute-1 ceph-mon[80107]: from='client.26678 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:47 compute-1 ceph-mon[80107]: from='client.26687 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:47 compute-1 ceph-mon[80107]: from='client.26693 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:47 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2936063970' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 26 10:19:47 compute-1 ceph-mon[80107]: from='client.17073 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:47 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3805920001' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 26 10:19:47 compute-1 ceph-mon[80107]: from='client.17079 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:47 compute-1 ceph-mon[80107]: from='client.26705 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:47 compute-1 ceph-mon[80107]: pgmap v1129: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:19:47 compute-1 ceph-mon[80107]: from='client.17085 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:47 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3333168501' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 26 10:19:47 compute-1 ceph-mon[80107]: from='client.26711 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:47 compute-1 ceph-mon[80107]: from='client.26377 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:47 compute-1 podman[242246]: 2026-01-26 10:19:47.349317859 +0000 UTC m=+0.122305889 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 10:19:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:19:47 compute-1 nova_compute[226322]: 2026-01-26 10:19:47.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:19:47 compute-1 nova_compute[226322]: 2026-01-26 10:19:47.714 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:19:47 compute-1 nova_compute[226322]: 2026-01-26 10:19:47.715 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:19:47 compute-1 nova_compute[226322]: 2026-01-26 10:19:47.715 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:19:47 compute-1 nova_compute[226322]: 2026-01-26 10:19:47.715 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:19:47 compute-1 nova_compute[226322]: 2026-01-26 10:19:47.715 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:19:47 compute-1 sshd-session[242232]: Invalid user admin from 152.42.136.110 port 54902
Jan 26 10:19:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:19:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:19:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:19:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:19:48 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Jan 26 10:19:48 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/578049670' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 26 10:19:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:19:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:48.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:19:48 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:19:48 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2480028327' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:19:48 compute-1 nova_compute[226322]: 2026-01-26 10:19:48.184 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:19:48 compute-1 sshd-session[242232]: Connection closed by invalid user admin 152.42.136.110 port 54902 [preauth]
Jan 26 10:19:48 compute-1 nova_compute[226322]: 2026-01-26 10:19:48.370 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:19:48 compute-1 nova_compute[226322]: 2026-01-26 10:19:48.372 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4610MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:19:48 compute-1 nova_compute[226322]: 2026-01-26 10:19:48.373 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:19:48 compute-1 nova_compute[226322]: 2026-01-26 10:19:48.373 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:19:48 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Jan 26 10:19:48 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/734800606' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 26 10:19:48 compute-1 nova_compute[226322]: 2026-01-26 10:19:48.480 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:19:48 compute-1 nova_compute[226322]: 2026-01-26 10:19:48.480 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:19:48 compute-1 nova_compute[226322]: 2026-01-26 10:19:48.507 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:19:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:48.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:48 compute-1 ceph-mon[80107]: from='client.26723 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:48 compute-1 ceph-mon[80107]: from='client.17100 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:48 compute-1 ceph-mon[80107]: from='client.26395 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:48 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2801142467' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 26 10:19:48 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1296280456' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 26 10:19:48 compute-1 ceph-mon[80107]: from='client.26401 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:48 compute-1 ceph-mon[80107]: from='client.26413 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:48 compute-1 ceph-mon[80107]: from='client.17118 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:48 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/578049670' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 26 10:19:48 compute-1 ceph-mon[80107]: from='client.26765 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:48 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/4035918887' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 26 10:19:48 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2480028327' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:19:48 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Jan 26 10:19:48 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/470784433' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 10:19:48 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:19:48 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1554868477' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:19:49 compute-1 nova_compute[226322]: 2026-01-26 10:19:49.008 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:19:49 compute-1 nova_compute[226322]: 2026-01-26 10:19:49.014 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:19:49 compute-1 nova_compute[226322]: 2026-01-26 10:19:49.028 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:19:49 compute-1 nova_compute[226322]: 2026-01-26 10:19:49.029 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:19:49 compute-1 nova_compute[226322]: 2026-01-26 10:19:49.030 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:19:49 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Jan 26 10:19:49 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1549787676' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='client.26425 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='client.17136 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2429009617' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/734800606' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='client.26783 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='client.26449 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/4149818100' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: pgmap v1130: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/470784433' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1554868477' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='client.26810 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='client.26467 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3732424704' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1549787676' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1249600633' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1515469255' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2867308421' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:19:49 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:19:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:50.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:50 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:19:50 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:19:50 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:19:50 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:19:50 compute-1 nova_compute[226322]: 2026-01-26 10:19:50.334 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:50 compute-1 nova_compute[226322]: 2026-01-26 10:19:50.429 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:50 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:19:50 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:19:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:50.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:50 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:19:50 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:19:50 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Jan 26 10:19:50 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/42955068' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 26 10:19:50 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:19:50 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:19:50 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:19:50 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='client.17148 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='client.26825 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='client.17157 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='client.26506 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/42955068' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:19:51 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:19:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Jan 26 10:19:51 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3491128388' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 26 10:19:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Jan 26 10:19:51 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/567662507' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 26 10:19:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:52.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Jan 26 10:19:52 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1194734970' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 26 10:19:52 compute-1 ceph-mon[80107]: from='client.26545 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:19:52 compute-1 ceph-mon[80107]: pgmap v1131: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:19:52 compute-1 ceph-mon[80107]: from='client.26924 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3052496530' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 26 10:19:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2244219533' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 26 10:19:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3491128388' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 26 10:19:52 compute-1 ceph-mon[80107]: from='client.17280 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:52 compute-1 ceph-mon[80107]: from='client.26629 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/567662507' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 26 10:19:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/4037000624' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 26 10:19:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2365034706' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 26 10:19:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:52.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:19:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Jan 26 10:19:52 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1873884357' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 26 10:19:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:19:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:19:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:19:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:19:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1194734970' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 26 10:19:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/736816567' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 26 10:19:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3971769629' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 26 10:19:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1873884357' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 26 10:19:53 compute-1 ceph-mon[80107]: pgmap v1132: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:19:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/344258398' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 26 10:19:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/851279560' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 26 10:19:53 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Jan 26 10:19:53 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2117799381' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 26 10:19:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:19:53.943 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:19:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:19:53.943 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:19:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:19:53.943 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:19:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:54.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:54 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0)
Jan 26 10:19:54 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3853829658' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 26 10:19:54 compute-1 ceph-mon[80107]: from='client.26975 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:54 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3217068959' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 26 10:19:54 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3797688383' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 26 10:19:54 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2117799381' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 26 10:19:54 compute-1 ceph-mon[80107]: from='client.26680 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:54 compute-1 ceph-mon[80107]: from='client.17325 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:54 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3853829658' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 26 10:19:54 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1444274541' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 26 10:19:54 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1877010172' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 26 10:19:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:54.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:55 compute-1 nova_compute[226322]: 2026-01-26 10:19:55.336 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:55 compute-1 nova_compute[226322]: 2026-01-26 10:19:55.431 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:19:55 compute-1 ceph-mon[80107]: from='client.27002 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:55 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3658239915' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 26 10:19:55 compute-1 ceph-mon[80107]: pgmap v1133: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:19:55 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1431985574' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 26 10:19:55 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/898803205' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Jan 26 10:19:55 compute-1 ceph-mon[80107]: from='client.26713 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:55 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/851808899' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Jan 26 10:19:55 compute-1 ovs-appctl[243877]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 26 10:19:55 compute-1 ovs-appctl[243883]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 26 10:19:55 compute-1 ovs-appctl[243900]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 26 10:19:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:56.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0)
Jan 26 10:19:56 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3198355035' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Jan 26 10:19:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Jan 26 10:19:56 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1046191194' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Jan 26 10:19:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:19:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:56.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:19:56 compute-1 ceph-mon[80107]: from='client.17346 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:56 compute-1 ceph-mon[80107]: from='client.27017 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:56 compute-1 ceph-mon[80107]: from='client.27029 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:56 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3896528404' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Jan 26 10:19:56 compute-1 ceph-mon[80107]: from='client.26731 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:56 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3198355035' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Jan 26 10:19:56 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1046191194' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Jan 26 10:19:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0)
Jan 26 10:19:57 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2271613595' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Jan 26 10:19:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:19:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0)
Jan 26 10:19:57 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3463540812' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Jan 26 10:19:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:19:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:19:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:19:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:19:58 compute-1 ceph-mon[80107]: from='client.17364 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:58 compute-1 ceph-mon[80107]: from='client.26740 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:58 compute-1 ceph-mon[80107]: from='client.17370 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3109192064' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Jan 26 10:19:58 compute-1 ceph-mon[80107]: from='client.27059 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:58 compute-1 ceph-mon[80107]: pgmap v1134: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:19:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2501247436' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Jan 26 10:19:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2516004710' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Jan 26 10:19:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2966967997' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Jan 26 10:19:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2271613595' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Jan 26 10:19:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:19:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:58.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:19:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:19:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:19:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:58.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:19:59 compute-1 ceph-mon[80107]: from='client.27068 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:59 compute-1 ceph-mon[80107]: from='client.26761 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:59 compute-1 ceph-mon[80107]: from='client.17397 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:59 compute-1 ceph-mon[80107]: from='client.26773 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:19:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3463540812' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Jan 26 10:19:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/1353136906' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:19:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/1353136906' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:19:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3548181160' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Jan 26 10:19:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/4077096930' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Jan 26 10:19:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/88667544' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Jan 26 10:19:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2874472771' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Jan 26 10:19:59 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 26 10:19:59 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3299309898' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 26 10:19:59 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Jan 26 10:19:59 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1487676632' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 26 10:19:59 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0)
Jan 26 10:19:59 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2011035917' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Jan 26 10:20:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:00.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:00 compute-1 ceph-mon[80107]: from='client.17406 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:20:00 compute-1 ceph-mon[80107]: from='client.27107 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:20:00 compute-1 ceph-mon[80107]: from='client.27131 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:20:00 compute-1 ceph-mon[80107]: pgmap v1135: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:20:00 compute-1 ceph-mon[80107]: from='client.26812 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:20:00 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3299309898' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 26 10:20:00 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1487676632' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 26 10:20:00 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3634761303' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 26 10:20:00 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2011035917' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Jan 26 10:20:00 compute-1 ceph-mon[80107]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore; 1 failed cephadm daemon(s)
Jan 26 10:20:00 compute-1 ceph-mon[80107]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Jan 26 10:20:00 compute-1 ceph-mon[80107]:      osd.2 observed slow operation indications in BlueStore
Jan 26 10:20:00 compute-1 ceph-mon[80107]: [WRN] CEPHADM_FAILED_DAEMON: 1 failed cephadm daemon(s)
Jan 26 10:20:00 compute-1 ceph-mon[80107]:     daemon nfs.cephfs.1.0.compute-2.najyrz on compute-2 is in error state
Jan 26 10:20:00 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Jan 26 10:20:00 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2107503081' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 26 10:20:00 compute-1 nova_compute[226322]: 2026-01-26 10:20:00.337 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:00 compute-1 nova_compute[226322]: 2026-01-26 10:20:00.433 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:00.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:00 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0)
Jan 26 10:20:00 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1400915670' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 10:20:01 compute-1 ceph-mon[80107]: from='client.17445 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:20:01 compute-1 ceph-mon[80107]: from='client.26821 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:20:01 compute-1 ceph-mon[80107]: from='client.17454 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:20:01 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/410198663' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 26 10:20:01 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2107503081' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 26 10:20:01 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1050308051' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Jan 26 10:20:01 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3758279747' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 26 10:20:01 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1400915670' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 10:20:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0)
Jan 26 10:20:01 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3955527617' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Jan 26 10:20:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0)
Jan 26 10:20:01 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4171722842' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Jan 26 10:20:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:02.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0)
Jan 26 10:20:02 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/657391317' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Jan 26 10:20:02 compute-1 ceph-mon[80107]: from='client.27173 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:02 compute-1 ceph-mon[80107]: pgmap v1136: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:20:02 compute-1 ceph-mon[80107]: from='client.26857 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:02 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1317730798' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Jan 26 10:20:02 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3955527617' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Jan 26 10:20:02 compute-1 ceph-mon[80107]: from='client.17484 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:02 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1848367617' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 10:20:02 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/4171722842' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Jan 26 10:20:02 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2041293799' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 10:20:02 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/626783217' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Jan 26 10:20:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:20:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:02.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:20:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:20:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:20:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:20:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:20:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:20:03 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0)
Jan 26 10:20:03 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2158557761' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Jan 26 10:20:03 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/657391317' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Jan 26 10:20:03 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1518384862' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Jan 26 10:20:03 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/465928881' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Jan 26 10:20:03 compute-1 ceph-mon[80107]: from='client.27224 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:03 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2371224661' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Jan 26 10:20:03 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2324653960' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Jan 26 10:20:03 compute-1 ceph-mon[80107]: pgmap v1137: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:20:03 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2158557761' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Jan 26 10:20:03 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Jan 26 10:20:03 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1688964418' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Jan 26 10:20:03 compute-1 podman[245646]: 2026-01-26 10:20:03.911657211 +0000 UTC m=+0.091191574 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 26 10:20:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:04.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:04 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Jan 26 10:20:04 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1535477781' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 26 10:20:04 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2436935829' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Jan 26 10:20:04 compute-1 ceph-mon[80107]: from='client.26893 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:04 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1688964418' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Jan 26 10:20:04 compute-1 ceph-mon[80107]: from='client.17523 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:20:04 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/742367659' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Jan 26 10:20:04 compute-1 ceph-mon[80107]: from='client.27248 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:04 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3169316236' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Jan 26 10:20:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:04.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:05 compute-1 nova_compute[226322]: 2026-01-26 10:20:05.339 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:05 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0)
Jan 26 10:20:05 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2906224273' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Jan 26 10:20:05 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3772322359' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Jan 26 10:20:05 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1535477781' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 26 10:20:05 compute-1 ceph-mon[80107]: from='client.26914 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:05 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2529800502' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Jan 26 10:20:05 compute-1 ceph-mon[80107]: from='client.27272 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:05 compute-1 ceph-mon[80107]: pgmap v1138: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:20:05 compute-1 ceph-mon[80107]: from='client.27281 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:05 compute-1 ceph-mon[80107]: from='client.17541 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:05 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1774929671' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 26 10:20:05 compute-1 nova_compute[226322]: 2026-01-26 10:20:05.435 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:05 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0)
Jan 26 10:20:05 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2459420877' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 26 10:20:06 compute-1 virtqemud[225791]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 26 10:20:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:06.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:06 compute-1 sudo[246090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:20:06 compute-1 sudo[246090]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:20:06 compute-1 sudo[246090]: pam_unix(sudo:session): session closed for user root
Jan 26 10:20:06 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2906224273' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Jan 26 10:20:06 compute-1 ceph-mon[80107]: from='client.26932 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:06 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2033412290' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 26 10:20:06 compute-1 ceph-mon[80107]: from='client.26938 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:06 compute-1 ceph-mon[80107]: from='client.17565 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:06 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2459420877' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 26 10:20:06 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1227439692' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Jan 26 10:20:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:20:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:06.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:20:06 compute-1 systemd[1]: Starting Time & Date Service...
Jan 26 10:20:06 compute-1 systemd[1]: Started Time & Date Service.
Jan 26 10:20:06 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Jan 26 10:20:06 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/634901022' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 10:20:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Jan 26 10:20:07 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3521053892' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Jan 26 10:20:07 compute-1 ceph-mon[80107]: from='client.17571 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:07 compute-1 ceph-mon[80107]: from='client.27308 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:07 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3170118741' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 26 10:20:07 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3956985062' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Jan 26 10:20:07 compute-1 ceph-mon[80107]: from='client.27314 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:07 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2440398676' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 26 10:20:07 compute-1 ceph-mon[80107]: pgmap v1139: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:20:07 compute-1 ceph-mon[80107]: from='client.26965 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:07 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/634901022' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 10:20:07 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3521053892' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Jan 26 10:20:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:20:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:20:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:20:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:20:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:20:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:08.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:20:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:08.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:20:08 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Jan 26 10:20:08 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3007392388' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 26 10:20:08 compute-1 ceph-mon[80107]: from='client.17592 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:08 compute-1 ceph-mon[80107]: from='client.26968 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:08 compute-1 ceph-mon[80107]: from='client.17604 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:08 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/285396718' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 10:20:08 compute-1 ceph-mon[80107]: from='client.27338 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:08 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2581328815' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 10:20:08 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/139676970' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Jan 26 10:20:08 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3068562462' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Jan 26 10:20:08 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3007392388' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 26 10:20:09 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0)
Jan 26 10:20:09 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4192985596' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Jan 26 10:20:09 compute-1 ceph-mon[80107]: from='client.27350 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:09 compute-1 ceph-mon[80107]: from='client.27004 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:09 compute-1 ceph-mon[80107]: from='client.17631 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:09 compute-1 ceph-mon[80107]: from='client.27010 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:09 compute-1 ceph-mon[80107]: pgmap v1140: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:20:09 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/4192985596' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Jan 26 10:20:09 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3055550707' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 26 10:20:09 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2842003281' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 26 10:20:09 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3881451233' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Jan 26 10:20:10 compute-1 sudo[246441]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:20:10 compute-1 sudo[246441]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:20:10 compute-1 sudo[246441]: pam_unix(sudo:session): session closed for user root
Jan 26 10:20:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:20:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:10.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:20:10 compute-1 sudo[246466]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:20:10 compute-1 sudo[246466]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:20:10 compute-1 nova_compute[226322]: 2026-01-26 10:20:10.340 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:10 compute-1 nova_compute[226322]: 2026-01-26 10:20:10.436 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:20:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:10.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:20:10 compute-1 sudo[246466]: pam_unix(sudo:session): session closed for user root
Jan 26 10:20:10 compute-1 ceph-mon[80107]: from='client.17643 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:20:10 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3473316423' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Jan 26 10:20:10 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:20:10 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:20:11 compute-1 ceph-mon[80107]: pgmap v1141: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:20:11 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:20:11 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:20:11 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:20:11 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:20:11 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:20:11 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:20:11 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:20:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:20:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:12.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:20:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:12.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:20:12 compute-1 ceph-mon[80107]: pgmap v1142: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:20:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:20:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:20:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:20:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:20:13 compute-1 sshd-session[246523]: Invalid user admin from 178.62.249.31 port 58214
Jan 26 10:20:13 compute-1 sshd-session[246523]: Connection closed by invalid user admin 178.62.249.31 port 58214 [preauth]
Jan 26 10:20:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:14.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:20:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:14.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:20:15 compute-1 ceph-mon[80107]: pgmap v1143: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:20:15 compute-1 nova_compute[226322]: 2026-01-26 10:20:15.392 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:15 compute-1 nova_compute[226322]: 2026-01-26 10:20:15.437 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:15 compute-1 sshd-session[246526]: Invalid user centos from 104.248.198.71 port 55842
Jan 26 10:20:15 compute-1 sshd-session[246526]: Connection closed by invalid user centos 104.248.198.71 port 55842 [preauth]
Jan 26 10:20:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:16.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:20:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:16.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:20:17 compute-1 ceph-mon[80107]: pgmap v1144: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:20:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:20:17 compute-1 sudo[246530]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:20:17 compute-1 sudo[246530]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:20:17 compute-1 sudo[246530]: pam_unix(sudo:session): session closed for user root
Jan 26 10:20:17 compute-1 podman[246554]: 2026-01-26 10:20:17.969346557 +0000 UTC m=+0.114674961 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 10:20:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:20:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:20:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:20:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:20:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:18.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:18.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:19 compute-1 ceph-mon[80107]: pgmap v1145: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:20:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:20:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:20:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:20.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:20 compute-1 nova_compute[226322]: 2026-01-26 10:20:20.393 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:20 compute-1 nova_compute[226322]: 2026-01-26 10:20:20.438 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:20.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:20 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:20:20 compute-1 ceph-mon[80107]: pgmap v1146: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:20:21 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 10:20:21 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.1 total, 600.0 interval
                                           Cumulative writes: 10K writes, 40K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
                                           Cumulative WAL: 10K writes, 2923 syncs, 3.64 writes per sync, written: 0.03 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2426 writes, 8573 keys, 2426 commit groups, 1.0 writes per commit group, ingest: 9.03 MB, 0.02 MB/s
                                           Interval WAL: 2426 writes, 1011 syncs, 2.40 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 26 10:20:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:20:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:22.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:20:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:20:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:22.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:20:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:20:22 compute-1 ceph-mon[80107]: pgmap v1147: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:20:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:20:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:20:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:20:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:20:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:20:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:24.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:20:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:24.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:24 compute-1 ceph-mon[80107]: pgmap v1148: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:20:25 compute-1 nova_compute[226322]: 2026-01-26 10:20:25.434 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:25 compute-1 nova_compute[226322]: 2026-01-26 10:20:25.439 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:26.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:26 compute-1 sudo[246589]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:20:26 compute-1 sudo[246589]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:20:26 compute-1 sudo[246589]: pam_unix(sudo:session): session closed for user root
Jan 26 10:20:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:26.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:26 compute-1 ceph-mon[80107]: pgmap v1149: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:20:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:20:27 compute-1 sshd-session[246614]: Invalid user admin from 152.42.136.110 port 59762
Jan 26 10:20:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:20:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:20:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:20:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:20:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:20:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:28.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:20:28 compute-1 sshd-session[246614]: Connection closed by invalid user admin 152.42.136.110 port 59762 [preauth]
Jan 26 10:20:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:20:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:28.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:20:28 compute-1 ceph-mon[80107]: pgmap v1150: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:20:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:30.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:30 compute-1 nova_compute[226322]: 2026-01-26 10:20:30.435 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:30 compute-1 nova_compute[226322]: 2026-01-26 10:20:30.440 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:30.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:30 compute-1 ceph-mon[80107]: pgmap v1151: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:20:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:20:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:32.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:20:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:32.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:20:32 compute-1 ceph-mon[80107]: pgmap v1152: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:20:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:20:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:20:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:20:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:20:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:20:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:20:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:34.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:20:34 compute-1 podman[246620]: 2026-01-26 10:20:34.291662875 +0000 UTC m=+0.069017402 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:20:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:34.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:34 compute-1 ceph-mon[80107]: pgmap v1153: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:20:35 compute-1 nova_compute[226322]: 2026-01-26 10:20:35.438 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:35 compute-1 nova_compute[226322]: 2026-01-26 10:20:35.441 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:36.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:36 compute-1 ceph-mon[80107]: pgmap v1154: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:20:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:36.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:36 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 26 10:20:36 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 10:20:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:20:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:20:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:20:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:20:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:20:38 compute-1 nova_compute[226322]: 2026-01-26 10:20:38.030 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:20:38 compute-1 nova_compute[226322]: 2026-01-26 10:20:38.031 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:20:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:20:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:38.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:20:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:38.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:38 compute-1 ceph-mon[80107]: pgmap v1155: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:20:38 compute-1 nova_compute[226322]: 2026-01-26 10:20:38.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:20:38 compute-1 nova_compute[226322]: 2026-01-26 10:20:38.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:20:38 compute-1 nova_compute[226322]: 2026-01-26 10:20:38.685 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:20:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1229155004' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:20:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3914856423' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:20:39 compute-1 nova_compute[226322]: 2026-01-26 10:20:39.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:20:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:40.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:40 compute-1 nova_compute[226322]: 2026-01-26 10:20:40.438 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:40 compute-1 nova_compute[226322]: 2026-01-26 10:20:40.442 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:40.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:40 compute-1 ceph-mon[80107]: pgmap v1156: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:20:41 compute-1 nova_compute[226322]: 2026-01-26 10:20:41.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:20:41 compute-1 nova_compute[226322]: 2026-01-26 10:20:41.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:20:41 compute-1 nova_compute[226322]: 2026-01-26 10:20:41.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:20:41 compute-1 nova_compute[226322]: 2026-01-26 10:20:41.706 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:20:41 compute-1 nova_compute[226322]: 2026-01-26 10:20:41.706 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:20:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:42.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:20:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:42.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:20:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:20:42 compute-1 ceph-mon[80107]: pgmap v1157: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:20:42 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/4260343528' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:20:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:20:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:20:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:20:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:20:44 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3327021470' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:20:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:44.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:44.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:45 compute-1 ceph-mon[80107]: pgmap v1158: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:20:45 compute-1 nova_compute[226322]: 2026-01-26 10:20:45.440 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:45 compute-1 nova_compute[226322]: 2026-01-26 10:20:45.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:20:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:20:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:46.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:20:46 compute-1 sudo[246650]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:20:46 compute-1 sudo[246650]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:20:46 compute-1 sudo[246650]: pam_unix(sudo:session): session closed for user root
Jan 26 10:20:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:20:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:46.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:20:47 compute-1 ceph-mon[80107]: pgmap v1159: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:20:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:20:47 compute-1 nova_compute[226322]: 2026-01-26 10:20:47.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:20:47 compute-1 nova_compute[226322]: 2026-01-26 10:20:47.935 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:20:47 compute-1 nova_compute[226322]: 2026-01-26 10:20:47.936 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:20:47 compute-1 nova_compute[226322]: 2026-01-26 10:20:47.936 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:20:47 compute-1 nova_compute[226322]: 2026-01-26 10:20:47.936 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:20:47 compute-1 nova_compute[226322]: 2026-01-26 10:20:47.937 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:20:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:20:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:20:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:20:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:20:48 compute-1 ceph-mon[80107]: pgmap v1160: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:20:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:48.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:48 compute-1 podman[246696]: 2026-01-26 10:20:48.30360136 +0000 UTC m=+0.084273882 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 26 10:20:48 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:20:48 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/355653575' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:20:48 compute-1 nova_compute[226322]: 2026-01-26 10:20:48.456 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.519s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:20:48 compute-1 nova_compute[226322]: 2026-01-26 10:20:48.620 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:20:48 compute-1 nova_compute[226322]: 2026-01-26 10:20:48.621 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4694MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:20:48 compute-1 nova_compute[226322]: 2026-01-26 10:20:48.622 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:20:48 compute-1 nova_compute[226322]: 2026-01-26 10:20:48.622 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:20:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:48.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:48 compute-1 nova_compute[226322]: 2026-01-26 10:20:48.685 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:20:48 compute-1 nova_compute[226322]: 2026-01-26 10:20:48.686 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:20:48 compute-1 nova_compute[226322]: 2026-01-26 10:20:48.714 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:20:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/355653575' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:20:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:20:49 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:20:49 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1812899865' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:20:49 compute-1 nova_compute[226322]: 2026-01-26 10:20:49.230 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:20:49 compute-1 nova_compute[226322]: 2026-01-26 10:20:49.237 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:20:49 compute-1 nova_compute[226322]: 2026-01-26 10:20:49.274 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:20:49 compute-1 nova_compute[226322]: 2026-01-26 10:20:49.275 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:20:49 compute-1 nova_compute[226322]: 2026-01-26 10:20:49.275 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:20:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:50.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:50 compute-1 ceph-mon[80107]: pgmap v1161: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:20:50 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1812899865' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:20:50 compute-1 nova_compute[226322]: 2026-01-26 10:20:50.441 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:50 compute-1 nova_compute[226322]: 2026-01-26 10:20:50.443 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:20:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:50.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:52.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:52 compute-1 ceph-mon[80107]: pgmap v1162: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:20:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:52.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:20:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:20:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:20:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:20:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:20:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:20:53.943 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:20:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:20:53.944 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:20:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:20:53.944 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:20:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:54.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:54 compute-1 ceph-mon[80107]: pgmap v1163: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:20:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:20:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:54.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:20:55 compute-1 nova_compute[226322]: 2026-01-26 10:20:55.444 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:20:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:56.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:56.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:56 compute-1 ceph-mon[80107]: pgmap v1164: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:20:57 compute-1 sshd-session[246750]: Invalid user admin from 178.62.249.31 port 36290
Jan 26 10:20:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:20:57 compute-1 sshd-session[246750]: Connection closed by invalid user admin 178.62.249.31 port 36290 [preauth]
Jan 26 10:20:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:20:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:20:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:20:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:20:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:58.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 10:20:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3073080115' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:20:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 10:20:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3073080115' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:20:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:20:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:20:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:58.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:20:58 compute-1 ceph-mon[80107]: pgmap v1165: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:20:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/3073080115' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:20:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/3073080115' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:20:58 compute-1 sudo[239351]: pam_unix(sudo:session): session closed for user root
Jan 26 10:20:58 compute-1 sshd-session[239350]: Received disconnect from 192.168.122.10 port 60902:11: disconnected by user
Jan 26 10:20:58 compute-1 sshd-session[239350]: Disconnected from user zuul 192.168.122.10 port 60902
Jan 26 10:20:58 compute-1 sshd-session[239347]: pam_unix(sshd:session): session closed for user zuul
Jan 26 10:20:58 compute-1 systemd[1]: session-55.scope: Deactivated successfully.
Jan 26 10:20:58 compute-1 systemd[1]: session-55.scope: Consumed 2min 50.963s CPU time, 640.8M memory peak, read 202.2M from disk, written 65.0M to disk.
Jan 26 10:20:58 compute-1 systemd-logind[783]: Session 55 logged out. Waiting for processes to exit.
Jan 26 10:20:58 compute-1 systemd-logind[783]: Removed session 55.
Jan 26 10:20:59 compute-1 sshd-session[246753]: Accepted publickey for zuul from 192.168.122.10 port 37202 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 10:20:59 compute-1 systemd-logind[783]: New session 56 of user zuul.
Jan 26 10:20:59 compute-1 systemd[1]: Started Session 56 of User zuul.
Jan 26 10:20:59 compute-1 sshd-session[246753]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 10:20:59 compute-1 sudo[246757]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-1-2026-01-26-vwslqwx.tar.xz
Jan 26 10:20:59 compute-1 sudo[246757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:20:59 compute-1 sudo[246757]: pam_unix(sudo:session): session closed for user root
Jan 26 10:20:59 compute-1 sshd-session[246756]: Received disconnect from 192.168.122.10 port 37202:11: disconnected by user
Jan 26 10:20:59 compute-1 sshd-session[246756]: Disconnected from user zuul 192.168.122.10 port 37202
Jan 26 10:20:59 compute-1 sshd-session[246753]: pam_unix(sshd:session): session closed for user zuul
Jan 26 10:20:59 compute-1 systemd[1]: session-56.scope: Deactivated successfully.
Jan 26 10:20:59 compute-1 systemd-logind[783]: Session 56 logged out. Waiting for processes to exit.
Jan 26 10:20:59 compute-1 systemd-logind[783]: Removed session 56.
Jan 26 10:20:59 compute-1 sshd-session[246782]: Accepted publickey for zuul from 192.168.122.10 port 37216 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 10:20:59 compute-1 systemd-logind[783]: New session 57 of user zuul.
Jan 26 10:20:59 compute-1 systemd[1]: Started Session 57 of User zuul.
Jan 26 10:20:59 compute-1 sshd-session[246782]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 10:20:59 compute-1 sudo[246786]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Jan 26 10:20:59 compute-1 sudo[246786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:20:59 compute-1 sudo[246786]: pam_unix(sudo:session): session closed for user root
Jan 26 10:20:59 compute-1 sshd-session[246785]: Received disconnect from 192.168.122.10 port 37216:11: disconnected by user
Jan 26 10:20:59 compute-1 sshd-session[246785]: Disconnected from user zuul 192.168.122.10 port 37216
Jan 26 10:20:59 compute-1 sshd-session[246782]: pam_unix(sshd:session): session closed for user zuul
Jan 26 10:20:59 compute-1 systemd[1]: session-57.scope: Deactivated successfully.
Jan 26 10:20:59 compute-1 systemd-logind[783]: Session 57 logged out. Waiting for processes to exit.
Jan 26 10:20:59 compute-1 systemd-logind[783]: Removed session 57.
Jan 26 10:21:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:00.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:00 compute-1 nova_compute[226322]: 2026-01-26 10:21:00.446 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:21:00 compute-1 nova_compute[226322]: 2026-01-26 10:21:00.448 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:21:00 compute-1 nova_compute[226322]: 2026-01-26 10:21:00.448 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:21:00 compute-1 nova_compute[226322]: 2026-01-26 10:21:00.449 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:21:00 compute-1 nova_compute[226322]: 2026-01-26 10:21:00.450 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:21:00 compute-1 nova_compute[226322]: 2026-01-26 10:21:00.453 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:21:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:00.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:00 compute-1 ceph-mon[80107]: pgmap v1166: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:21:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:02.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:02.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:21:02 compute-1 ceph-mon[80107]: pgmap v1167: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:21:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:21:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:21:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:21:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:21:03 compute-1 sshd-session[246813]: Invalid user centos from 104.248.198.71 port 43786
Jan 26 10:21:03 compute-1 sshd-session[246813]: Connection closed by invalid user centos 104.248.198.71 port 43786 [preauth]
Jan 26 10:21:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:21:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:21:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:04.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:21:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:04.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:04 compute-1 ceph-mon[80107]: pgmap v1168: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:21:05 compute-1 podman[246816]: 2026-01-26 10:21:05.335178458 +0000 UTC m=+0.101505278 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 26 10:21:05 compute-1 nova_compute[226322]: 2026-01-26 10:21:05.450 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:21:05 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 10:21:05 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.0 total, 600.0 interval
                                           Cumulative writes: 6966 writes, 36K keys, 6966 commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.04 MB/s
                                           Cumulative WAL: 6966 writes, 6966 syncs, 1.00 writes per sync, written: 0.09 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1523 writes, 7934 keys, 1523 commit groups, 1.0 writes per commit group, ingest: 17.68 MB, 0.03 MB/s
                                           Interval WAL: 1523 writes, 1523 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     87.3      0.61              0.17        19    0.032       0      0       0.0       0.0
                                             L6      1/0   14.26 MB   0.0      0.3     0.1      0.2       0.2      0.0       0.0   4.4    144.2    123.5      1.90              0.58        18    0.106    100K    10K       0.0       0.0
                                            Sum      1/0   14.26 MB   0.0      0.3     0.1      0.2       0.3      0.1       0.0   5.4    109.1    114.7      2.52              0.75        37    0.068    100K    10K       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.8    109.2    109.8      0.60              0.20         8    0.075     26K   2580       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.3     0.1      0.2       0.2      0.0       0.0   0.0    144.2    123.5      1.90              0.58        18    0.106    100K    10K       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     87.7      0.61              0.17        18    0.034       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 2400.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.052, interval 0.009
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.28 GB write, 0.12 MB/s write, 0.27 GB read, 0.11 MB/s read, 2.5 seconds
                                           Interval compaction: 0.06 GB write, 0.11 MB/s write, 0.06 GB read, 0.11 MB/s read, 0.6 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55af2cb209b0#2 capacity: 304.00 MB usage: 24.02 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000242 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1466,23.24 MB,7.64475%) FilterBlock(37,289.36 KB,0.0929531%) IndexBlock(37,511.45 KB,0.164298%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Jan 26 10:21:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:06.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:06 compute-1 sudo[246838]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:21:06 compute-1 sudo[246838]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:21:06 compute-1 sudo[246838]: pam_unix(sudo:session): session closed for user root
Jan 26 10:21:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:06.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:06 compute-1 ceph-mon[80107]: pgmap v1169: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:21:07 compute-1 sshd-session[246863]: Invalid user admin from 152.42.136.110 port 41606
Jan 26 10:21:07 compute-1 sshd-session[246863]: Connection closed by invalid user admin 152.42.136.110 port 41606 [preauth]
Jan 26 10:21:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:21:07 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Jan 26 10:21:07 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:07.988815) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 10:21:07 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Jan 26 10:21:07 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422867988846, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 2079, "num_deletes": 506, "total_data_size": 3966421, "memory_usage": 4016848, "flush_reason": "Manual Compaction"}
Jan 26 10:21:07 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Jan 26 10:21:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:21:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:21:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:21:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422868010612, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 2581045, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 34788, "largest_seqno": 36862, "table_properties": {"data_size": 2571773, "index_size": 4934, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3269, "raw_key_size": 27614, "raw_average_key_size": 21, "raw_value_size": 2549462, "raw_average_value_size": 1967, "num_data_blocks": 212, "num_entries": 1296, "num_filter_entries": 1296, "num_deletions": 506, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769422755, "oldest_key_time": 1769422755, "file_creation_time": 1769422867, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 21961 microseconds, and 8091 cpu microseconds.
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.010765) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 2581045 bytes OK
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.010795) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.012999) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.013030) EVENT_LOG_v1 {"time_micros": 1769422868013021, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.013058) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 3954998, prev total WAL file size 3954998, number of live WAL files 2.
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.014740) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(2520KB)], [66(14MB)]
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422868014798, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 17531538, "oldest_snapshot_seqno": -1}
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6549 keys, 15261766 bytes, temperature: kUnknown
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422868156552, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 15261766, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15216930, "index_size": 27380, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16389, "raw_key_size": 172139, "raw_average_key_size": 26, "raw_value_size": 15097796, "raw_average_value_size": 2305, "num_data_blocks": 1082, "num_entries": 6549, "num_filter_entries": 6549, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769422868, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.156945) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 15261766 bytes
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.158275) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 123.5 rd, 107.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.5, 14.3 +0.0 blob) out(14.6 +0.0 blob), read-write-amplify(12.7) write-amplify(5.9) OK, records in: 7576, records dropped: 1027 output_compression: NoCompression
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.158303) EVENT_LOG_v1 {"time_micros": 1769422868158291, "job": 40, "event": "compaction_finished", "compaction_time_micros": 141923, "compaction_time_cpu_micros": 53028, "output_level": 6, "num_output_files": 1, "total_output_size": 15261766, "num_input_records": 7576, "num_output_records": 6549, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422868159309, "job": 40, "event": "table_file_deletion", "file_number": 68}
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422868164331, "job": 40, "event": "table_file_deletion", "file_number": 66}
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.014638) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.164387) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.164392) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.164394) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.164395) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:21:08 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.164397) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:21:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:21:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:08.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:21:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:08.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:09 compute-1 ceph-mon[80107]: pgmap v1170: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:21:10 compute-1 ceph-mon[80107]: pgmap v1171: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:21:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:10.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:10 compute-1 nova_compute[226322]: 2026-01-26 10:21:10.453 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:21:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:21:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:10.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:21:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:21:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:12.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:21:12 compute-1 ceph-mon[80107]: pgmap v1172: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:21:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:21:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:21:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:12.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:21:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:21:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:21:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:21:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:21:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:14.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:14 compute-1 ceph-mon[80107]: pgmap v1173: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:21:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:21:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:14.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:21:15 compute-1 nova_compute[226322]: 2026-01-26 10:21:15.455 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:21:15 compute-1 nova_compute[226322]: 2026-01-26 10:21:15.457 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:21:15 compute-1 nova_compute[226322]: 2026-01-26 10:21:15.457 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:21:15 compute-1 nova_compute[226322]: 2026-01-26 10:21:15.457 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:21:15 compute-1 nova_compute[226322]: 2026-01-26 10:21:15.507 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:21:15 compute-1 nova_compute[226322]: 2026-01-26 10:21:15.508 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:21:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:21:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:16.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:21:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:21:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:16.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:21:16 compute-1 ceph-mon[80107]: pgmap v1174: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:21:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:21:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:21:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:21:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:21:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:21:18 compute-1 sudo[246871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:21:18 compute-1 sudo[246871]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:21:18 compute-1 sudo[246871]: pam_unix(sudo:session): session closed for user root
Jan 26 10:21:18 compute-1 sudo[246896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Jan 26 10:21:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:21:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:18.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:21:18 compute-1 sudo[246896]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:21:18 compute-1 sudo[246896]: pam_unix(sudo:session): session closed for user root
Jan 26 10:21:18 compute-1 podman[246934]: 2026-01-26 10:21:18.590719817 +0000 UTC m=+0.093868606 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 26 10:21:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:21:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:18.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:21:18 compute-1 ceph-mon[80107]: pgmap v1175: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:21:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:21:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:21:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:21:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:21:18 compute-1 sudo[246967]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:21:18 compute-1 sudo[246967]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:21:18 compute-1 sudo[246967]: pam_unix(sudo:session): session closed for user root
Jan 26 10:21:18 compute-1 sudo[246992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:21:18 compute-1 sudo[246992]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:21:19 compute-1 sudo[246992]: pam_unix(sudo:session): session closed for user root
Jan 26 10:21:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:21:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:21:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:21:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:21:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:20.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:21:20 compute-1 nova_compute[226322]: 2026-01-26 10:21:20.508 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:21:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:21:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:20.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:21:20 compute-1 ceph-mon[80107]: pgmap v1176: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:21:20 compute-1 ceph-mon[80107]: pgmap v1177: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:21:20 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:21:20 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:21:20 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:21:20 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:21:20 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:21:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:22.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:21:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:22.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:22 compute-1 ceph-mon[80107]: pgmap v1178: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:21:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:21:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:21:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:21:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:21:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:24.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:24.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:25 compute-1 ceph-mon[80107]: pgmap v1179: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:21:25 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:21:25 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:21:25 compute-1 sudo[247052]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:21:25 compute-1 sudo[247052]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:21:25 compute-1 sudo[247052]: pam_unix(sudo:session): session closed for user root
Jan 26 10:21:25 compute-1 nova_compute[226322]: 2026-01-26 10:21:25.508 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:21:25 compute-1 nova_compute[226322]: 2026-01-26 10:21:25.510 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:21:26 compute-1 ceph-mon[80107]: pgmap v1180: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:21:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:21:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:26.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:21:26 compute-1 sudo[247078]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:21:26 compute-1 sudo[247078]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:21:26 compute-1 sudo[247078]: pam_unix(sudo:session): session closed for user root
Jan 26 10:21:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:21:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:26.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:21:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:21:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:21:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:21:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:21:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:21:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:28.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:28 compute-1 ceph-mon[80107]: pgmap v1181: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:21:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:28.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:30.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:30 compute-1 nova_compute[226322]: 2026-01-26 10:21:30.509 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:21:30 compute-1 nova_compute[226322]: 2026-01-26 10:21:30.511 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:21:30 compute-1 ceph-mon[80107]: pgmap v1182: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:21:30 compute-1 nova_compute[226322]: 2026-01-26 10:21:30.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:21:30 compute-1 nova_compute[226322]: 2026-01-26 10:21:30.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 26 10:21:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:30.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:32.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:32.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:21:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:21:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:21:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:21:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:21:33 compute-1 ceph-mon[80107]: pgmap v1183: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:21:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000054s ======
Jan 26 10:21:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:34.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 26 10:21:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:21:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:34.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:21:34 compute-1 ceph-mon[80107]: pgmap v1184: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:21:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:21:35 compute-1 nova_compute[226322]: 2026-01-26 10:21:35.510 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:21:35 compute-1 nova_compute[226322]: 2026-01-26 10:21:35.512 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:21:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:36.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:36 compute-1 podman[247108]: 2026-01-26 10:21:36.295427006 +0000 UTC m=+0.065923956 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 26 10:21:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:36.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:36 compute-1 ceph-mon[80107]: pgmap v1185: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:21:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:21:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:21:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:21:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:21:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:21:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:38.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:38 compute-1 nova_compute[226322]: 2026-01-26 10:21:38.704 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:21:38 compute-1 nova_compute[226322]: 2026-01-26 10:21:38.705 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:21:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:38.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:38 compute-1 ceph-mon[80107]: pgmap v1186: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:21:39 compute-1 nova_compute[226322]: 2026-01-26 10:21:39.680 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:21:39 compute-1 nova_compute[226322]: 2026-01-26 10:21:39.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:21:39 compute-1 nova_compute[226322]: 2026-01-26 10:21:39.685 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:21:39 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Jan 26 10:21:39 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:39.920914) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 10:21:39 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Jan 26 10:21:39 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422899920948, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 591, "num_deletes": 250, "total_data_size": 1124047, "memory_usage": 1149016, "flush_reason": "Manual Compaction"}
Jan 26 10:21:39 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Jan 26 10:21:39 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422899927697, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 590931, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36867, "largest_seqno": 37453, "table_properties": {"data_size": 587990, "index_size": 913, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7809, "raw_average_key_size": 21, "raw_value_size": 581968, "raw_average_value_size": 1572, "num_data_blocks": 37, "num_entries": 370, "num_filter_entries": 370, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769422869, "oldest_key_time": 1769422869, "file_creation_time": 1769422899, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:21:39 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 6885 microseconds, and 2567 cpu microseconds.
Jan 26 10:21:39 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:21:39 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:39.927794) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 590931 bytes OK
Jan 26 10:21:39 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:39.927816) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Jan 26 10:21:39 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:39.930791) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Jan 26 10:21:39 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:39.930805) EVENT_LOG_v1 {"time_micros": 1769422899930801, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 10:21:39 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:39.930822) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 10:21:39 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 1120680, prev total WAL file size 1120680, number of live WAL files 2.
Jan 26 10:21:39 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:21:39 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:39.931505) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303030' seq:72057594037927935, type:22 .. '6D6772737461740031323531' seq:0, type:0; will stop at (end)
Jan 26 10:21:39 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 10:21:39 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(577KB)], [69(14MB)]
Jan 26 10:21:39 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422899931553, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 15852697, "oldest_snapshot_seqno": -1}
Jan 26 10:21:40 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6411 keys, 11851886 bytes, temperature: kUnknown
Jan 26 10:21:40 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422900035481, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 11851886, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11812376, "index_size": 22376, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16069, "raw_key_size": 169458, "raw_average_key_size": 26, "raw_value_size": 11700025, "raw_average_value_size": 1824, "num_data_blocks": 874, "num_entries": 6411, "num_filter_entries": 6411, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769422899, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:21:40 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:21:40 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:40.036227) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 11851886 bytes
Jan 26 10:21:40 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:40.039425) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 151.9 rd, 113.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 14.6 +0.0 blob) out(11.3 +0.0 blob), read-write-amplify(46.9) write-amplify(20.1) OK, records in: 6919, records dropped: 508 output_compression: NoCompression
Jan 26 10:21:40 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:40.039469) EVENT_LOG_v1 {"time_micros": 1769422900039453, "job": 42, "event": "compaction_finished", "compaction_time_micros": 104367, "compaction_time_cpu_micros": 32261, "output_level": 6, "num_output_files": 1, "total_output_size": 11851886, "num_input_records": 6919, "num_output_records": 6411, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 10:21:40 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:21:40 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422900039810, "job": 42, "event": "table_file_deletion", "file_number": 71}
Jan 26 10:21:40 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1482975074' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:21:40 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:21:40 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422900043767, "job": 42, "event": "table_file_deletion", "file_number": 69}
Jan 26 10:21:40 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:39.931418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:21:40 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:40.043941) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:21:40 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:40.043950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:21:40 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:40.043954) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:21:40 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:40.043957) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:21:40 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:40.043960) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:21:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:40.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:40 compute-1 nova_compute[226322]: 2026-01-26 10:21:40.513 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:21:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:21:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:40.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:21:41 compute-1 ceph-mon[80107]: pgmap v1187: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:21:41 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2032313011' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:21:41 compute-1 nova_compute[226322]: 2026-01-26 10:21:41.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:21:41 compute-1 nova_compute[226322]: 2026-01-26 10:21:41.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:21:41 compute-1 nova_compute[226322]: 2026-01-26 10:21:41.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:21:41 compute-1 nova_compute[226322]: 2026-01-26 10:21:41.716 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:21:41 compute-1 nova_compute[226322]: 2026-01-26 10:21:41.717 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:21:41 compute-1 nova_compute[226322]: 2026-01-26 10:21:41.717 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:21:41 compute-1 nova_compute[226322]: 2026-01-26 10:21:41.717 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 26 10:21:41 compute-1 nova_compute[226322]: 2026-01-26 10:21:41.747 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 26 10:21:42 compute-1 sshd-session[247130]: Invalid user admin from 178.62.249.31 port 55136
Jan 26 10:21:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:42.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:42 compute-1 sshd-session[247130]: Connection closed by invalid user admin 178.62.249.31 port 55136 [preauth]
Jan 26 10:21:42 compute-1 ceph-mon[80107]: pgmap v1188: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:21:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:21:42 compute-1 nova_compute[226322]: 2026-01-26 10:21:42.717 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:21:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:42.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:21:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:21:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:21:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:21:43 compute-1 nova_compute[226322]: 2026-01-26 10:21:43.682 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:21:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:21:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:44.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:21:44 compute-1 ceph-mon[80107]: pgmap v1189: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:21:44 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1538737131' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:21:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:44.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:45 compute-1 nova_compute[226322]: 2026-01-26 10:21:45.515 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:21:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2110077342' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:21:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:46.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:46 compute-1 sudo[247134]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:21:46 compute-1 sudo[247134]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:21:46 compute-1 sudo[247134]: pam_unix(sudo:session): session closed for user root
Jan 26 10:21:46 compute-1 nova_compute[226322]: 2026-01-26 10:21:46.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:21:46 compute-1 nova_compute[226322]: 2026-01-26 10:21:46.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:21:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:21:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:46.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:21:46 compute-1 ceph-mon[80107]: pgmap v1190: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:21:47 compute-1 sshd-session[247159]: Invalid user admin from 152.42.136.110 port 45658
Jan 26 10:21:47 compute-1 sshd-session[247159]: Connection closed by invalid user admin 152.42.136.110 port 45658 [preauth]
Jan 26 10:21:47 compute-1 nova_compute[226322]: 2026-01-26 10:21:47.704 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:21:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:21:47 compute-1 nova_compute[226322]: 2026-01-26 10:21:47.733 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:21:47 compute-1 nova_compute[226322]: 2026-01-26 10:21:47.733 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:21:47 compute-1 nova_compute[226322]: 2026-01-26 10:21:47.733 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:21:47 compute-1 nova_compute[226322]: 2026-01-26 10:21:47.733 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:21:47 compute-1 nova_compute[226322]: 2026-01-26 10:21:47.734 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:21:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:21:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:21:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:21:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:21:48 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:21:48 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1170456353' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:21:48 compute-1 nova_compute[226322]: 2026-01-26 10:21:48.174 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:21:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:21:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:48.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:21:48 compute-1 nova_compute[226322]: 2026-01-26 10:21:48.335 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:21:48 compute-1 nova_compute[226322]: 2026-01-26 10:21:48.336 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4831MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:21:48 compute-1 nova_compute[226322]: 2026-01-26 10:21:48.336 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:21:48 compute-1 nova_compute[226322]: 2026-01-26 10:21:48.337 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:21:48 compute-1 nova_compute[226322]: 2026-01-26 10:21:48.581 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:21:48 compute-1 nova_compute[226322]: 2026-01-26 10:21:48.582 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:21:48 compute-1 nova_compute[226322]: 2026-01-26 10:21:48.644 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:21:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:48.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:48 compute-1 ceph-mon[80107]: pgmap v1191: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:21:48 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1170456353' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:21:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:21:49 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:21:49 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/737420374' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:21:49 compute-1 nova_compute[226322]: 2026-01-26 10:21:49.069 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:21:49 compute-1 nova_compute[226322]: 2026-01-26 10:21:49.076 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:21:49 compute-1 nova_compute[226322]: 2026-01-26 10:21:49.120 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:21:49 compute-1 nova_compute[226322]: 2026-01-26 10:21:49.121 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:21:49 compute-1 nova_compute[226322]: 2026-01-26 10:21:49.122 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.785s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:21:49 compute-1 podman[247206]: 2026-01-26 10:21:49.341645109 +0000 UTC m=+0.120651966 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 26 10:21:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/737420374' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:21:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:21:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:50.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:21:50 compute-1 sshd-session[247234]: Invalid user centos from 104.248.198.71 port 52544
Jan 26 10:21:50 compute-1 sshd-session[247234]: Connection closed by invalid user centos 104.248.198.71 port 52544 [preauth]
Jan 26 10:21:50 compute-1 nova_compute[226322]: 2026-01-26 10:21:50.515 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:21:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:21:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:50.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:21:50 compute-1 ceph-mon[80107]: pgmap v1192: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:21:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:52.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:21:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:52.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:21:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:21:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:21:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:21:53 compute-1 ceph-mon[80107]: pgmap v1193: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:21:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:21:53.945 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:21:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:21:53.946 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:21:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:21:53.946 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:21:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:54.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:54.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:55 compute-1 ceph-mon[80107]: pgmap v1194: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:21:55 compute-1 nova_compute[226322]: 2026-01-26 10:21:55.519 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:21:56 compute-1 nova_compute[226322]: 2026-01-26 10:21:56.064 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:21:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:21:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:56.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:21:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:21:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:56.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:21:57 compute-1 ceph-mon[80107]: pgmap v1195: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:21:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:21:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:21:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:21:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:21:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:21:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 10:21:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3956459805' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:21:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 10:21:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3956459805' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:21:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:21:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:58.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:21:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:21:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:21:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:58.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:21:59 compute-1 ceph-mon[80107]: pgmap v1196: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:21:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/3956459805' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:21:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/3956459805' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:22:00 compute-1 ceph-mon[80107]: pgmap v1197: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:22:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:22:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:00.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:22:00 compute-1 nova_compute[226322]: 2026-01-26 10:22:00.518 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:22:00 compute-1 nova_compute[226322]: 2026-01-26 10:22:00.521 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:22:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:00.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:22:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:02.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:22:02 compute-1 ceph-mon[80107]: pgmap v1198: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:22:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:22:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:22:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:02.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:22:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:22:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:22:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:22:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:22:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:22:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:04.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:22:04 compute-1 ceph-mon[80107]: pgmap v1199: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:22:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:22:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:04.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:05 compute-1 nova_compute[226322]: 2026-01-26 10:22:05.520 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:22:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:22:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:06.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:22:06 compute-1 ceph-mon[80107]: pgmap v1200: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:22:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:06.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:06 compute-1 sudo[247244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:22:06 compute-1 sudo[247244]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:22:06 compute-1 sudo[247244]: pam_unix(sudo:session): session closed for user root
Jan 26 10:22:06 compute-1 podman[247268]: 2026-01-26 10:22:06.861692597 +0000 UTC m=+0.082441463 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:22:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:22:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:22:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:22:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:22:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:22:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:22:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:08.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:22:08 compute-1 ceph-mon[80107]: pgmap v1201: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:22:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:22:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:08.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:22:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:10.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:10 compute-1 nova_compute[226322]: 2026-01-26 10:22:10.522 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:22:10 compute-1 ceph-mon[80107]: pgmap v1202: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:22:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:10.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:12.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:12 compute-1 ceph-mon[80107]: pgmap v1203: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:22:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:22:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:12.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:22:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:22:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:22:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:22:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:14.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:14 compute-1 ceph-mon[80107]: pgmap v1204: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:22:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:14.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:15 compute-1 nova_compute[226322]: 2026-01-26 10:22:15.524 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:22:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:22:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:16.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:22:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:22:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:16.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:22:16 compute-1 ceph-mon[80107]: pgmap v1205: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:22:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:22:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:22:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:22:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:22:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:22:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:22:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:18.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:22:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:22:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:18.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:22:18 compute-1 ceph-mon[80107]: pgmap v1206: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:22:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:22:19 compute-1 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 26 10:22:19 compute-1 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 26 10:22:19 compute-1 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 26 10:22:19 compute-1 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Jan 26 10:22:19 compute-1 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Jan 26 10:22:19 compute-1 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Jan 26 10:22:19 compute-1 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 26 10:22:19 compute-1 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Jan 26 10:22:19 compute-1 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Jan 26 10:22:20 compute-1 podman[247296]: 2026-01-26 10:22:20.321747001 +0000 UTC m=+0.094677174 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 10:22:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:20.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:20 compute-1 nova_compute[226322]: 2026-01-26 10:22:20.525 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:22:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:20.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:21 compute-1 ceph-mon[80107]: pgmap v1207: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:22:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:22:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:22.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:22:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:22:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:22.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:22:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:22:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:22:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:22:23 compute-1 ceph-mon[80107]: pgmap v1208: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 4.8 KiB/s rd, 0 B/s wr, 7 op/s
Jan 26 10:22:24 compute-1 ceph-mon[80107]: pgmap v1209: 353 pgs: 353 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 4.3 KiB/s rd, 0 B/s wr, 6 op/s
Jan 26 10:22:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:24.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:22:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:24.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:22:25 compute-1 sudo[247323]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:22:25 compute-1 sudo[247323]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:22:25 compute-1 sudo[247323]: pam_unix(sudo:session): session closed for user root
Jan 26 10:22:25 compute-1 sudo[247348]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:22:25 compute-1 sudo[247348]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:22:25 compute-1 nova_compute[226322]: 2026-01-26 10:22:25.527 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:22:25 compute-1 nova_compute[226322]: 2026-01-26 10:22:25.529 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:22:25 compute-1 sudo[247348]: pam_unix(sudo:session): session closed for user root
Jan 26 10:22:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:26.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:26 compute-1 ceph-mon[80107]: pgmap v1210: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 48 KiB/s rd, 0 B/s wr, 79 op/s
Jan 26 10:22:26 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:22:26 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:22:26 compute-1 ceph-mon[80107]: pgmap v1211: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 55 KiB/s rd, 0 B/s wr, 90 op/s
Jan 26 10:22:26 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:22:26 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:22:26 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:22:26 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:22:26 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:22:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:26.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:26 compute-1 sudo[247406]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:22:26 compute-1 sudo[247406]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:22:26 compute-1 sudo[247406]: pam_unix(sudo:session): session closed for user root
Jan 26 10:22:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:22:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:22:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:22:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:22:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:22:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:28.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:28 compute-1 sshd-session[247432]: Invalid user admin from 152.42.136.110 port 34782
Jan 26 10:22:28 compute-1 sshd-session[247432]: Connection closed by invalid user admin 152.42.136.110 port 34782 [preauth]
Jan 26 10:22:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:28.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:29 compute-1 ceph-mon[80107]: pgmap v1212: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 55 KiB/s rd, 0 B/s wr, 90 op/s
Jan 26 10:22:30 compute-1 ceph-mon[80107]: pgmap v1213: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 55 KiB/s rd, 0 B/s wr, 91 op/s
Jan 26 10:22:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:22:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:30.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:22:30 compute-1 nova_compute[226322]: 2026-01-26 10:22:30.529 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:22:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:22:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:30.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:22:32 compute-1 sudo[247436]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:22:32 compute-1 sudo[247436]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:22:32 compute-1 sudo[247436]: pam_unix(sudo:session): session closed for user root
Jan 26 10:22:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:32.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:22:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:22:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:32.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:22:32 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:22:32 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:22:32 compute-1 ceph-mon[80107]: pgmap v1214: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 51 KiB/s rd, 0 B/s wr, 84 op/s
Jan 26 10:22:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:22:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:22:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:22:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:22:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:22:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:34.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:22:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:34.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:22:35 compute-1 ceph-mon[80107]: pgmap v1215: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 51 KiB/s rd, 0 B/s wr, 84 op/s
Jan 26 10:22:35 compute-1 nova_compute[226322]: 2026-01-26 10:22:35.530 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:22:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:36.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:36 compute-1 sshd-session[247463]: Invalid user test from 104.248.198.71 port 38174
Jan 26 10:22:36 compute-1 sshd-session[247463]: Connection closed by invalid user test 104.248.198.71 port 38174 [preauth]
Jan 26 10:22:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:36.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:37 compute-1 ceph-mon[80107]: pgmap v1216: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Jan 26 10:22:37 compute-1 podman[247465]: 2026-01-26 10:22:37.277962793 +0000 UTC m=+0.056687966 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 10:22:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:22:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:22:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:22:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:22:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:22:38 compute-1 ceph-mon[80107]: pgmap v1217: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:22:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:38.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:22:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:38.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:22:39 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/384900377' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:22:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:22:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:40.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:22:40 compute-1 nova_compute[226322]: 2026-01-26 10:22:40.532 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:22:40 compute-1 ceph-mon[80107]: pgmap v1218: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:22:40 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1580551924' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:22:40 compute-1 nova_compute[226322]: 2026-01-26 10:22:40.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:22:40 compute-1 nova_compute[226322]: 2026-01-26 10:22:40.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:22:40 compute-1 nova_compute[226322]: 2026-01-26 10:22:40.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:22:40 compute-1 nova_compute[226322]: 2026-01-26 10:22:40.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:22:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:40.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:41 compute-1 nova_compute[226322]: 2026-01-26 10:22:41.682 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:22:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:42.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:42 compute-1 nova_compute[226322]: 2026-01-26 10:22:42.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:22:42 compute-1 nova_compute[226322]: 2026-01-26 10:22:42.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:22:42 compute-1 nova_compute[226322]: 2026-01-26 10:22:42.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:22:42 compute-1 nova_compute[226322]: 2026-01-26 10:22:42.704 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:22:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:22:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:22:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:42.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:22:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:22:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:22:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:22:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:22:43 compute-1 ceph-mon[80107]: pgmap v1219: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:22:43 compute-1 nova_compute[226322]: 2026-01-26 10:22:43.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:22:43 compute-1 nova_compute[226322]: 2026-01-26 10:22:43.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:22:44 compute-1 ceph-mon[80107]: pgmap v1220: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:22:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:22:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:44.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:22:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:44.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3419026471' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:22:45 compute-1 nova_compute[226322]: 2026-01-26 10:22:45.533 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:22:45 compute-1 nova_compute[226322]: 2026-01-26 10:22:45.535 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:22:45 compute-1 nova_compute[226322]: 2026-01-26 10:22:45.535 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:22:45 compute-1 nova_compute[226322]: 2026-01-26 10:22:45.535 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:22:45 compute-1 nova_compute[226322]: 2026-01-26 10:22:45.536 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:22:45 compute-1 nova_compute[226322]: 2026-01-26 10:22:45.537 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:22:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:22:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:46.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:22:46 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1792820060' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:22:46 compute-1 ceph-mon[80107]: pgmap v1221: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:22:46 compute-1 nova_compute[226322]: 2026-01-26 10:22:46.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:22:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:46.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:46 compute-1 sudo[247489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:22:46 compute-1 sudo[247489]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:22:46 compute-1 sudo[247489]: pam_unix(sudo:session): session closed for user root
Jan 26 10:22:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:22:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:22:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:22:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:22:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:22:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:48.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:48.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:49 compute-1 ceph-mon[80107]: pgmap v1222: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:22:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:22:49 compute-1 nova_compute[226322]: 2026-01-26 10:22:49.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:22:49 compute-1 nova_compute[226322]: 2026-01-26 10:22:49.743 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:22:49 compute-1 nova_compute[226322]: 2026-01-26 10:22:49.744 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:22:49 compute-1 nova_compute[226322]: 2026-01-26 10:22:49.744 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:22:49 compute-1 nova_compute[226322]: 2026-01-26 10:22:49.745 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:22:49 compute-1 nova_compute[226322]: 2026-01-26 10:22:49.745 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:22:50 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:22:50 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1151407283' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:22:50 compute-1 nova_compute[226322]: 2026-01-26 10:22:50.227 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:22:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:50.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:50 compute-1 nova_compute[226322]: 2026-01-26 10:22:50.430 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:22:50 compute-1 nova_compute[226322]: 2026-01-26 10:22:50.432 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4848MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:22:50 compute-1 nova_compute[226322]: 2026-01-26 10:22:50.432 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:22:50 compute-1 nova_compute[226322]: 2026-01-26 10:22:50.433 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:22:50 compute-1 nova_compute[226322]: 2026-01-26 10:22:50.516 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:22:50 compute-1 nova_compute[226322]: 2026-01-26 10:22:50.516 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:22:50 compute-1 nova_compute[226322]: 2026-01-26 10:22:50.536 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:22:50 compute-1 nova_compute[226322]: 2026-01-26 10:22:50.587 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing inventories for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 26 10:22:50 compute-1 nova_compute[226322]: 2026-01-26 10:22:50.690 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating ProviderTree inventory for provider d06842a0-5d13-4573-bb78-d433bbb380e4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 26 10:22:50 compute-1 nova_compute[226322]: 2026-01-26 10:22:50.691 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating inventory in ProviderTree for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 10:22:50 compute-1 nova_compute[226322]: 2026-01-26 10:22:50.713 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing aggregate associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 26 10:22:50 compute-1 nova_compute[226322]: 2026-01-26 10:22:50.733 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing trait associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, traits: HW_CPU_X86_CLMUL,HW_CPU_X86_SSE,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_FMA3,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 26 10:22:50 compute-1 nova_compute[226322]: 2026-01-26 10:22:50.761 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:22:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:50.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:51 compute-1 ceph-mon[80107]: pgmap v1223: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:22:51 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1151407283' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:22:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:22:51 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3104688942' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:22:51 compute-1 nova_compute[226322]: 2026-01-26 10:22:51.234 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:22:51 compute-1 nova_compute[226322]: 2026-01-26 10:22:51.240 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:22:51 compute-1 nova_compute[226322]: 2026-01-26 10:22:51.261 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:22:51 compute-1 nova_compute[226322]: 2026-01-26 10:22:51.263 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:22:51 compute-1 nova_compute[226322]: 2026-01-26 10:22:51.263 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.830s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:22:51 compute-1 podman[247558]: 2026-01-26 10:22:51.344841397 +0000 UTC m=+0.114665108 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 26 10:22:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3104688942' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:22:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:52.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:22:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:52.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:22:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:22:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:22:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:22:53 compute-1 ceph-mon[80107]: pgmap v1224: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:22:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:22:53.947 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:22:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:22:53.947 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:22:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:22:53.948 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:22:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:54.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:54.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:55 compute-1 ceph-mon[80107]: pgmap v1225: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:22:55 compute-1 nova_compute[226322]: 2026-01-26 10:22:55.539 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:22:56 compute-1 ceph-mon[80107]: pgmap v1226: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:22:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:56.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:56.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:22:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:22:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:22:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:22:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:22:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:58.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:22:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:22:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:58.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:22:59 compute-1 ceph-mon[80107]: pgmap v1227: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:22:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/2992943752' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:22:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/2992943752' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:23:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:00.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:00 compute-1 nova_compute[226322]: 2026-01-26 10:23:00.540 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:23:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:00.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:01 compute-1 ceph-mon[80107]: pgmap v1228: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 0 B/s wr, 61 op/s
Jan 26 10:23:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:02.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:23:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:23:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:02.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:23:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:23:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:23:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:23:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:23:03 compute-1 ceph-mon[80107]: pgmap v1229: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 0 B/s wr, 60 op/s
Jan 26 10:23:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:23:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:23:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:04.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:23:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:23:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:04.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:23:05 compute-1 ceph-mon[80107]: pgmap v1230: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 0 B/s wr, 60 op/s
Jan 26 10:23:05 compute-1 nova_compute[226322]: 2026-01-26 10:23:05.542 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:23:06 compute-1 ceph-mon[80107]: pgmap v1231: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 0 B/s wr, 61 op/s
Jan 26 10:23:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:06.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:23:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:06.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:23:07 compute-1 sudo[247596]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:23:07 compute-1 sudo[247596]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:23:07 compute-1 sudo[247596]: pam_unix(sudo:session): session closed for user root
Jan 26 10:23:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:23:07 compute-1 sshd-session[247594]: Invalid user admin from 152.42.136.110 port 48630
Jan 26 10:23:07 compute-1 podman[247622]: 2026-01-26 10:23:07.878095378 +0000 UTC m=+0.054256999 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 10:23:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:23:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:23:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:23:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:23:08 compute-1 sshd-session[247594]: Connection closed by invalid user admin 152.42.136.110 port 48630 [preauth]
Jan 26 10:23:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:08.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:08.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:09 compute-1 ceph-mon[80107]: pgmap v1232: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 0 B/s wr, 60 op/s
Jan 26 10:23:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:10.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:10 compute-1 nova_compute[226322]: 2026-01-26 10:23:10.544 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:23:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:23:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:10.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:23:11 compute-1 ceph-mon[80107]: pgmap v1233: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 0 B/s wr, 61 op/s
Jan 26 10:23:12 compute-1 ceph-mon[80107]: pgmap v1234: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:23:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:23:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:12.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:23:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:23:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:12.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:23:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:23:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:23:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:23:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:23:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:14.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:23:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:23:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:14.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:23:15 compute-1 ceph-mon[80107]: pgmap v1235: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:23:15 compute-1 nova_compute[226322]: 2026-01-26 10:23:15.546 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:23:16 compute-1 ceph-mon[80107]: pgmap v1236: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:23:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:16.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:23:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:16.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:23:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:23:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:23:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:23:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:23:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:23:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:23:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:18.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:23:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:23:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:18.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:23:19 compute-1 ceph-mon[80107]: pgmap v1237: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:23:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:23:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:20.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:20 compute-1 nova_compute[226322]: 2026-01-26 10:23:20.548 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:23:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:20.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:21 compute-1 ceph-mon[80107]: pgmap v1238: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:23:22 compute-1 podman[247649]: 2026-01-26 10:23:22.386942872 +0000 UTC m=+0.156996386 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 26 10:23:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:23:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:22.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:23:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:23:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:23:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:22.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:23:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:23:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:23:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:23:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:23:23 compute-1 ceph-mon[80107]: pgmap v1239: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:23:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:23:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:24.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:23:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:24.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:25 compute-1 ceph-mon[80107]: pgmap v1240: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:23:25 compute-1 nova_compute[226322]: 2026-01-26 10:23:25.550 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:23:25 compute-1 nova_compute[226322]: 2026-01-26 10:23:25.552 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:23:25 compute-1 nova_compute[226322]: 2026-01-26 10:23:25.553 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:23:25 compute-1 nova_compute[226322]: 2026-01-26 10:23:25.553 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:23:25 compute-1 nova_compute[226322]: 2026-01-26 10:23:25.565 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:23:25 compute-1 nova_compute[226322]: 2026-01-26 10:23:25.565 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:23:26 compute-1 ceph-mon[80107]: pgmap v1241: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:23:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:23:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:26.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:23:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:26.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:27 compute-1 sudo[247679]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:23:27 compute-1 sudo[247679]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:23:27 compute-1 sudo[247679]: pam_unix(sudo:session): session closed for user root
Jan 26 10:23:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:23:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:23:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:23:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:23:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:23:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:23:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:28.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:23:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:28.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:29 compute-1 ceph-mon[80107]: pgmap v1242: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:23:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:30.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:30 compute-1 nova_compute[226322]: 2026-01-26 10:23:30.566 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:23:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:30.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:31 compute-1 ceph-mon[80107]: pgmap v1243: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:23:32 compute-1 sudo[247707]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:23:32 compute-1 sudo[247707]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:23:32 compute-1 sudo[247707]: pam_unix(sudo:session): session closed for user root
Jan 26 10:23:32 compute-1 sudo[247732]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:23:32 compute-1 sudo[247732]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:23:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:32.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:23:32 compute-1 ceph-mon[80107]: pgmap v1244: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:23:32 compute-1 sudo[247732]: pam_unix(sudo:session): session closed for user root
Jan 26 10:23:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:32.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:23:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:23:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:23:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:23:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:23:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:23:33 compute-1 ceph-mon[80107]: pgmap v1245: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Jan 26 10:23:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:23:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:23:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:23:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:23:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:23:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:23:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:34.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:23:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:23:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:34.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:35 compute-1 nova_compute[226322]: 2026-01-26 10:23:35.567 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:23:35 compute-1 nova_compute[226322]: 2026-01-26 10:23:35.568 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:23:35 compute-1 nova_compute[226322]: 2026-01-26 10:23:35.568 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:23:35 compute-1 nova_compute[226322]: 2026-01-26 10:23:35.568 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:23:35 compute-1 nova_compute[226322]: 2026-01-26 10:23:35.570 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:23:35 compute-1 nova_compute[226322]: 2026-01-26 10:23:35.571 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:23:35 compute-1 ceph-mon[80107]: pgmap v1246: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 1 op/s
Jan 26 10:23:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:23:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:36.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:23:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:36.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:23:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:23:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:23:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:23:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:23:38 compute-1 ceph-mon[80107]: pgmap v1247: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Jan 26 10:23:38 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:23:38 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:23:38 compute-1 sudo[247790]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:23:38 compute-1 sudo[247790]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:23:38 compute-1 sudo[247790]: pam_unix(sudo:session): session closed for user root
Jan 26 10:23:38 compute-1 podman[247814]: 2026-01-26 10:23:38.212581808 +0000 UTC m=+0.057997054 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 10:23:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:23:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:38.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:23:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:38.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:39 compute-1 ceph-mon[80107]: pgmap v1248: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Jan 26 10:23:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:23:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:40.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:23:40 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/952901761' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:23:40 compute-1 nova_compute[226322]: 2026-01-26 10:23:40.571 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:23:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:23:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:40.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:23:41 compute-1 ceph-mon[80107]: pgmap v1249: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Jan 26 10:23:41 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1780312808' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:23:42 compute-1 nova_compute[226322]: 2026-01-26 10:23:42.264 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:23:42 compute-1 nova_compute[226322]: 2026-01-26 10:23:42.265 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:23:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:23:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:42.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:23:42 compute-1 nova_compute[226322]: 2026-01-26 10:23:42.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:23:42 compute-1 nova_compute[226322]: 2026-01-26 10:23:42.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:23:42 compute-1 nova_compute[226322]: 2026-01-26 10:23:42.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:23:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:23:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:42.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:23:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:23:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:23:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:23:43 compute-1 ceph-mon[80107]: pgmap v1250: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Jan 26 10:23:43 compute-1 nova_compute[226322]: 2026-01-26 10:23:43.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:23:43 compute-1 nova_compute[226322]: 2026-01-26 10:23:43.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:23:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:44.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:44 compute-1 nova_compute[226322]: 2026-01-26 10:23:44.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:23:44 compute-1 nova_compute[226322]: 2026-01-26 10:23:44.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:23:44 compute-1 nova_compute[226322]: 2026-01-26 10:23:44.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:23:44 compute-1 nova_compute[226322]: 2026-01-26 10:23:44.725 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:23:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:44.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:45 compute-1 sshd-session[247838]: Invalid user admin from 152.42.136.110 port 52998
Jan 26 10:23:45 compute-1 sshd-session[247838]: Connection closed by invalid user admin 152.42.136.110 port 52998 [preauth]
Jan 26 10:23:45 compute-1 nova_compute[226322]: 2026-01-26 10:23:45.573 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:23:45 compute-1 ceph-mon[80107]: pgmap v1251: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:23:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:46.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:46 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3486596600' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:23:46 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/611560939' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:23:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:46.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:47 compute-1 sudo[247842]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:23:47 compute-1 sudo[247842]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:23:47 compute-1 sudo[247842]: pam_unix(sudo:session): session closed for user root
Jan 26 10:23:47 compute-1 ceph-mon[80107]: pgmap v1252: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:23:47 compute-1 nova_compute[226322]: 2026-01-26 10:23:47.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:23:47 compute-1 nova_compute[226322]: 2026-01-26 10:23:47.705 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:23:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:23:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:23:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:23:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:23:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:23:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:23:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:48.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:23:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:23:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:23:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:48.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:23:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:23:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:50.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:23:50 compute-1 nova_compute[226322]: 2026-01-26 10:23:50.574 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:23:50 compute-1 ceph-mon[80107]: pgmap v1253: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:23:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:50.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:51 compute-1 nova_compute[226322]: 2026-01-26 10:23:51.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:23:51 compute-1 nova_compute[226322]: 2026-01-26 10:23:51.722 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:23:51 compute-1 nova_compute[226322]: 2026-01-26 10:23:51.723 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:23:51 compute-1 nova_compute[226322]: 2026-01-26 10:23:51.723 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:23:51 compute-1 nova_compute[226322]: 2026-01-26 10:23:51.724 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:23:51 compute-1 nova_compute[226322]: 2026-01-26 10:23:51.724 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:23:52 compute-1 ceph-mon[80107]: pgmap v1254: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:23:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:23:52 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/909490233' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:23:52 compute-1 nova_compute[226322]: 2026-01-26 10:23:52.165 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:23:52 compute-1 nova_compute[226322]: 2026-01-26 10:23:52.322 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:23:52 compute-1 nova_compute[226322]: 2026-01-26 10:23:52.324 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4837MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:23:52 compute-1 nova_compute[226322]: 2026-01-26 10:23:52.324 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:23:52 compute-1 nova_compute[226322]: 2026-01-26 10:23:52.325 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:23:52 compute-1 nova_compute[226322]: 2026-01-26 10:23:52.421 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:23:52 compute-1 nova_compute[226322]: 2026-01-26 10:23:52.422 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:23:52 compute-1 nova_compute[226322]: 2026-01-26 10:23:52.442 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:23:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:52.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:23:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:23:52 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4236003645' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:23:52 compute-1 nova_compute[226322]: 2026-01-26 10:23:52.886 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:23:52 compute-1 nova_compute[226322]: 2026-01-26 10:23:52.891 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:23:52 compute-1 nova_compute[226322]: 2026-01-26 10:23:52.904 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:23:52 compute-1 nova_compute[226322]: 2026-01-26 10:23:52.905 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:23:52 compute-1 nova_compute[226322]: 2026-01-26 10:23:52.906 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:23:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:52.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:23:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:23:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:23:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:23:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/909490233' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:23:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/4236003645' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:23:53 compute-1 podman[247914]: 2026-01-26 10:23:53.388124261 +0000 UTC m=+0.148692755 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 10:23:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:23:53.949 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:23:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:23:53.949 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:23:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:23:53.950 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:23:54 compute-1 ceph-mon[80107]: pgmap v1255: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:23:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:54.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:54.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:55 compute-1 nova_compute[226322]: 2026-01-26 10:23:55.576 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:23:56 compute-1 ceph-mon[80107]: pgmap v1256: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:23:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:56.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:56.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:57 compute-1 ceph-mon[80107]: pgmap v1257: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:23:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:23:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:23:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:23:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:23:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:23:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 10:23:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2050018793' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:23:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 10:23:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2050018793' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:23:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:23:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:58.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:23:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/2050018793' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:23:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/2050018793' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:23:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:23:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:23:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:58.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:23:59 compute-1 ceph-mon[80107]: pgmap v1258: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:24:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:00.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:00 compute-1 nova_compute[226322]: 2026-01-26 10:24:00.577 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:24:00 compute-1 nova_compute[226322]: 2026-01-26 10:24:00.580 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:24:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:00.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:02 compute-1 ceph-mon[80107]: pgmap v1259: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:24:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:24:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:02.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:24:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:24:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:24:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:02.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:24:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:24:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:24:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:24:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:24:04 compute-1 ceph-mon[80107]: pgmap v1260: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:24:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:24:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:04.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:04.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:05 compute-1 ceph-mon[80107]: pgmap v1261: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:24:05 compute-1 nova_compute[226322]: 2026-01-26 10:24:05.579 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:24:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:06.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:06.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:07 compute-1 sudo[247948]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:24:07 compute-1 sudo[247948]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:24:07 compute-1 sudo[247948]: pam_unix(sudo:session): session closed for user root
Jan 26 10:24:07 compute-1 ceph-mon[80107]: pgmap v1262: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:24:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:24:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:24:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:24:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:24:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:24:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:08.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:08.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:09 compute-1 podman[247974]: 2026-01-26 10:24:09.303692413 +0000 UTC m=+0.081111488 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 26 10:24:10 compute-1 ceph-mon[80107]: pgmap v1263: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:24:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:24:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:10.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:24:10 compute-1 nova_compute[226322]: 2026-01-26 10:24:10.583 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:24:10 compute-1 nova_compute[226322]: 2026-01-26 10:24:10.585 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:24:10 compute-1 nova_compute[226322]: 2026-01-26 10:24:10.585 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:24:10 compute-1 nova_compute[226322]: 2026-01-26 10:24:10.585 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:24:10 compute-1 nova_compute[226322]: 2026-01-26 10:24:10.629 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:24:10 compute-1 nova_compute[226322]: 2026-01-26 10:24:10.629 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:24:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:10.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:12 compute-1 ceph-mon[80107]: pgmap v1264: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:24:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:24:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:12.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:24:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:24:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:12.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:24:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:24:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:24:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:24:14 compute-1 ceph-mon[80107]: pgmap v1265: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:24:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:24:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:14.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:24:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:24:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:14.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:24:15 compute-1 nova_compute[226322]: 2026-01-26 10:24:15.631 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:24:16 compute-1 ceph-mon[80107]: pgmap v1266: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:24:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:16.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:16.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:24:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:24:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:24:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:24:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:24:18 compute-1 ceph-mon[80107]: pgmap v1267: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:24:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:18.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:18.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:24:20 compute-1 ceph-mon[80107]: pgmap v1268: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:24:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:20.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:20 compute-1 nova_compute[226322]: 2026-01-26 10:24:20.632 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:24:20 compute-1 nova_compute[226322]: 2026-01-26 10:24:20.634 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:24:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:20.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:21 compute-1 ceph-mon[80107]: pgmap v1269: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:24:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:22.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:24:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:24:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:22.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:24:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:24:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:24:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:24:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:24:24 compute-1 ceph-mon[80107]: pgmap v1270: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:24:24 compute-1 podman[248001]: 2026-01-26 10:24:24.337627973 +0000 UTC m=+0.114420272 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 10:24:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:24.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:24:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:24.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:24:25 compute-1 nova_compute[226322]: 2026-01-26 10:24:25.635 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:24:25 compute-1 nova_compute[226322]: 2026-01-26 10:24:25.637 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:24:25 compute-1 nova_compute[226322]: 2026-01-26 10:24:25.637 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:24:25 compute-1 nova_compute[226322]: 2026-01-26 10:24:25.637 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:24:25 compute-1 nova_compute[226322]: 2026-01-26 10:24:25.677 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:24:25 compute-1 nova_compute[226322]: 2026-01-26 10:24:25.677 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:24:26 compute-1 ceph-mon[80107]: pgmap v1271: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:24:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:26.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:26.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:27 compute-1 sudo[248028]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:24:27 compute-1 sudo[248028]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:24:27 compute-1 sudo[248028]: pam_unix(sudo:session): session closed for user root
Jan 26 10:24:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:24:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:24:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:24:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:24:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:24:28 compute-1 ceph-mon[80107]: pgmap v1272: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:24:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:24:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:28.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:24:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:24:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:28.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:24:30 compute-1 ceph-mon[80107]: pgmap v1273: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:24:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:24:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:30.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:24:30 compute-1 nova_compute[226322]: 2026-01-26 10:24:30.678 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:24:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:24:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:30.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:24:31 compute-1 ceph-mon[80107]: pgmap v1274: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:24:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:24:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:32.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:24:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:24:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:32.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:24:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:24:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:24:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:24:34 compute-1 ceph-mon[80107]: pgmap v1275: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:24:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:24:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:24:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:34.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:24:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:34.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:35 compute-1 ceph-mon[80107]: pgmap v1276: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:24:35 compute-1 nova_compute[226322]: 2026-01-26 10:24:35.680 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:24:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:36.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:36.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:37 compute-1 ceph-mon[80107]: pgmap v1277: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:24:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:24:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:24:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:24:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:24:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:24:38 compute-1 sudo[248059]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:24:38 compute-1 sudo[248059]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:24:38 compute-1 sudo[248059]: pam_unix(sudo:session): session closed for user root
Jan 26 10:24:38 compute-1 sudo[248084]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:24:38 compute-1 sudo[248084]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:24:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:24:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:38.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:24:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:24:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:38.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:24:39 compute-1 sudo[248084]: pam_unix(sudo:session): session closed for user root
Jan 26 10:24:39 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:24:39 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:24:39 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:24:39 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:24:39 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:24:39 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:24:39 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:24:40 compute-1 ceph-mon[80107]: pgmap v1278: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:24:40 compute-1 ceph-mon[80107]: pgmap v1279: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:24:40 compute-1 podman[248143]: 2026-01-26 10:24:40.276515946 +0000 UTC m=+0.051918576 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent)
Jan 26 10:24:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:40.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:40 compute-1 nova_compute[226322]: 2026-01-26 10:24:40.685 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:24:40 compute-1 nova_compute[226322]: 2026-01-26 10:24:40.686 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:24:40 compute-1 nova_compute[226322]: 2026-01-26 10:24:40.687 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:24:40 compute-1 nova_compute[226322]: 2026-01-26 10:24:40.687 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:24:40 compute-1 nova_compute[226322]: 2026-01-26 10:24:40.687 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:24:40 compute-1 nova_compute[226322]: 2026-01-26 10:24:40.688 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:24:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:40.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.197851) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423081197894, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 2015, "num_deletes": 251, "total_data_size": 5255356, "memory_usage": 5321232, "flush_reason": "Manual Compaction"}
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423081225054, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 3423062, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 37458, "largest_seqno": 39468, "table_properties": {"data_size": 3414852, "index_size": 5024, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 16999, "raw_average_key_size": 20, "raw_value_size": 3398409, "raw_average_value_size": 4036, "num_data_blocks": 218, "num_entries": 842, "num_filter_entries": 842, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769422900, "oldest_key_time": 1769422900, "file_creation_time": 1769423081, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 27368 microseconds, and 9720 cpu microseconds.
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.225190) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 3423062 bytes OK
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.225284) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.227672) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.227705) EVENT_LOG_v1 {"time_micros": 1769423081227689, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.227878) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 5246388, prev total WAL file size 5246388, number of live WAL files 2.
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.229456) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(3342KB)], [72(11MB)]
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423081229506, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 15274948, "oldest_snapshot_seqno": -1}
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6737 keys, 13078069 bytes, temperature: kUnknown
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423081315383, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 13078069, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13035403, "index_size": 24698, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16901, "raw_key_size": 176945, "raw_average_key_size": 26, "raw_value_size": 12916299, "raw_average_value_size": 1917, "num_data_blocks": 970, "num_entries": 6737, "num_filter_entries": 6737, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769423081, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.315600) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 13078069 bytes
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.317462) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 177.7 rd, 152.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.3, 11.3 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(8.3) write-amplify(3.8) OK, records in: 7253, records dropped: 516 output_compression: NoCompression
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.317479) EVENT_LOG_v1 {"time_micros": 1769423081317471, "job": 44, "event": "compaction_finished", "compaction_time_micros": 85938, "compaction_time_cpu_micros": 31860, "output_level": 6, "num_output_files": 1, "total_output_size": 13078069, "num_input_records": 7253, "num_output_records": 6737, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423081318046, "job": 44, "event": "table_file_deletion", "file_number": 74}
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423081320210, "job": 44, "event": "table_file_deletion", "file_number": 72}
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.229371) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.320277) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.320281) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.320282) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.320284) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:24:41 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.320285) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:24:42 compute-1 ceph-mon[80107]: pgmap v1280: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:24:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:42.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:24:42 compute-1 nova_compute[226322]: 2026-01-26 10:24:42.907 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:24:42 compute-1 nova_compute[226322]: 2026-01-26 10:24:42.908 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:24:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:42.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:24:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:24:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:24:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:24:43 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/4165630013' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:24:43 compute-1 nova_compute[226322]: 2026-01-26 10:24:43.684 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:24:43 compute-1 nova_compute[226322]: 2026-01-26 10:24:43.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:24:43 compute-1 nova_compute[226322]: 2026-01-26 10:24:43.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:24:43 compute-1 nova_compute[226322]: 2026-01-26 10:24:43.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:24:43 compute-1 sudo[248163]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:24:43 compute-1 sudo[248163]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:24:43 compute-1 sudo[248163]: pam_unix(sudo:session): session closed for user root
Jan 26 10:24:44 compute-1 ceph-mon[80107]: pgmap v1281: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:24:44 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2617897557' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:24:44 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:24:44 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:24:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:44.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:44 compute-1 nova_compute[226322]: 2026-01-26 10:24:44.688 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:24:44 compute-1 nova_compute[226322]: 2026-01-26 10:24:44.689 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:24:44 compute-1 nova_compute[226322]: 2026-01-26 10:24:44.689 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:24:44 compute-1 nova_compute[226322]: 2026-01-26 10:24:44.716 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:24:44 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:44 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:44 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:44.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:45 compute-1 nova_compute[226322]: 2026-01-26 10:24:45.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:24:45 compute-1 nova_compute[226322]: 2026-01-26 10:24:45.688 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:24:45 compute-1 nova_compute[226322]: 2026-01-26 10:24:45.792 226326 DEBUG oslo_concurrency.processutils [None req-695a35b5-cbaf-43c3-a71b-bf8c928be5ef c2f0bcfebfa24487b4079cc85d8950ce 3ff3fa2a5531460b993c609589aa545d - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:24:45 compute-1 nova_compute[226322]: 2026-01-26 10:24:45.822 226326 DEBUG oslo_concurrency.processutils [None req-695a35b5-cbaf-43c3-a71b-bf8c928be5ef c2f0bcfebfa24487b4079cc85d8950ce 3ff3fa2a5531460b993c609589aa545d - - default default] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:24:46 compute-1 ceph-mon[80107]: pgmap v1282: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:24:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:24:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:46.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:24:46 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:46 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:46 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:46.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:47 compute-1 sudo[248190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:24:47 compute-1 sudo[248190]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:24:47 compute-1 sudo[248190]: pam_unix(sudo:session): session closed for user root
Jan 26 10:24:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:24:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:24:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:24:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:24:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:24:48 compute-1 ceph-mon[80107]: pgmap v1283: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:24:48 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1158727383' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:24:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:48.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:48 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:48 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:48 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:48.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2989598300' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:24:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:24:49 compute-1 nova_compute[226322]: 2026-01-26 10:24:49.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:24:50 compute-1 ceph-mon[80107]: pgmap v1284: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:24:50 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:50 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:50 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:50.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:50 compute-1 nova_compute[226322]: 2026-01-26 10:24:50.691 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:24:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:51.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:51 compute-1 ceph-mon[80107]: pgmap v1285: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:24:52 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:52 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:52 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:52.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:52 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:24:52.557 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 10:24:52 compute-1 nova_compute[226322]: 2026-01-26 10:24:52.558 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:24:52 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:24:52.559 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 10:24:52 compute-1 nova_compute[226322]: 2026-01-26 10:24:52.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:24:52 compute-1 nova_compute[226322]: 2026-01-26 10:24:52.738 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:24:52 compute-1 nova_compute[226322]: 2026-01-26 10:24:52.739 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:24:52 compute-1 nova_compute[226322]: 2026-01-26 10:24:52.739 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:24:52 compute-1 nova_compute[226322]: 2026-01-26 10:24:52.740 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:24:52 compute-1 nova_compute[226322]: 2026-01-26 10:24:52.740 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:24:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:24:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:24:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:24:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:24:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:24:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:53.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:53 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:24:53 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3808329144' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:24:53 compute-1 nova_compute[226322]: 2026-01-26 10:24:53.239 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:24:53 compute-1 nova_compute[226322]: 2026-01-26 10:24:53.431 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:24:53 compute-1 nova_compute[226322]: 2026-01-26 10:24:53.432 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4848MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:24:53 compute-1 nova_compute[226322]: 2026-01-26 10:24:53.432 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:24:53 compute-1 nova_compute[226322]: 2026-01-26 10:24:53.433 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:24:53 compute-1 nova_compute[226322]: 2026-01-26 10:24:53.493 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:24:53 compute-1 nova_compute[226322]: 2026-01-26 10:24:53.494 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:24:53 compute-1 nova_compute[226322]: 2026-01-26 10:24:53.512 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:24:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:24:53.950 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:24:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:24:53.951 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:24:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:24:53.952 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:24:53 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:24:53 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/11944970' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:24:53 compute-1 nova_compute[226322]: 2026-01-26 10:24:53.979 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:24:53 compute-1 nova_compute[226322]: 2026-01-26 10:24:53.985 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:24:54 compute-1 nova_compute[226322]: 2026-01-26 10:24:54.003 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:24:54 compute-1 nova_compute[226322]: 2026-01-26 10:24:54.006 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:24:54 compute-1 nova_compute[226322]: 2026-01-26 10:24:54.007 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:24:54 compute-1 ceph-mon[80107]: pgmap v1286: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:24:54 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3808329144' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:24:54 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/11944970' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:24:54 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:54 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:54 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:54.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:24:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:55.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:24:55 compute-1 podman[248263]: 2026-01-26 10:24:55.376420219 +0000 UTC m=+0.142427271 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 26 10:24:55 compute-1 nova_compute[226322]: 2026-01-26 10:24:55.692 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:24:55 compute-1 nova_compute[226322]: 2026-01-26 10:24:55.694 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:24:56 compute-1 ceph-mon[80107]: pgmap v1287: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:24:56 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:56 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:24:56 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:56.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:24:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:57.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:24:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:24:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:24:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:24:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:24:58 compute-1 ceph-mon[80107]: pgmap v1288: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:24:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/4159726104' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:24:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/4159726104' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:24:58 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:58 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:58 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:58.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:24:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:24:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:59.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:24:59 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:24:59.561 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 10:25:00 compute-1 ceph-mon[80107]: pgmap v1289: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:25:00 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:00 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:25:00 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:00.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:25:00 compute-1 nova_compute[226322]: 2026-01-26 10:25:00.694 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:25:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:01.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:02 compute-1 ceph-mon[80107]: pgmap v1290: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:25:02 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:02 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:25:02 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:02.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:25:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:25:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:25:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:25:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:25:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:25:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:25:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:03.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:25:04 compute-1 ceph-mon[80107]: pgmap v1291: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:25:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:25:04 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:04 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:04 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:04.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:25:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:05.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:25:05 compute-1 ceph-mon[80107]: pgmap v1292: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:25:05 compute-1 nova_compute[226322]: 2026-01-26 10:25:05.695 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:25:06 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:06 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:06 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:06.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:07.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:07 compute-1 sudo[248295]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:25:07 compute-1 sudo[248295]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:25:07 compute-1 sudo[248295]: pam_unix(sudo:session): session closed for user root
Jan 26 10:25:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:25:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:25:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:25:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:25:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:25:08 compute-1 ceph-mon[80107]: pgmap v1293: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:25:08 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:08 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:25:08 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:08.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:25:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:25:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:09.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:25:10 compute-1 ceph-mon[80107]: pgmap v1294: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:25:10 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:10 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:10 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:10.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:10 compute-1 nova_compute[226322]: 2026-01-26 10:25:10.698 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:25:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:11.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:11 compute-1 podman[248322]: 2026-01-26 10:25:11.295561501 +0000 UTC m=+0.074773034 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 10:25:12 compute-1 ceph-mon[80107]: pgmap v1295: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:25:12 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:12 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:25:12 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:12.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:25:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:25:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:25:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:25:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:25:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:25:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:13.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:14 compute-1 ceph-mon[80107]: pgmap v1296: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:25:14 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:14 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:14 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:14.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:15.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:15 compute-1 nova_compute[226322]: 2026-01-26 10:25:15.700 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:25:15 compute-1 nova_compute[226322]: 2026-01-26 10:25:15.702 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:25:15 compute-1 nova_compute[226322]: 2026-01-26 10:25:15.702 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:25:15 compute-1 nova_compute[226322]: 2026-01-26 10:25:15.702 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:25:15 compute-1 nova_compute[226322]: 2026-01-26 10:25:15.731 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:25:15 compute-1 nova_compute[226322]: 2026-01-26 10:25:15.731 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:25:16 compute-1 ceph-mon[80107]: pgmap v1297: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:25:16 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:16 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:16 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:16.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:17.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:25:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:25:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:25:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:25:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:25:18 compute-1 ceph-mon[80107]: pgmap v1298: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:25:18 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:18 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:18 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:18.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:19.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:25:20 compute-1 ceph-mon[80107]: pgmap v1299: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:25:20 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:20 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:25:20 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:20.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:25:20 compute-1 nova_compute[226322]: 2026-01-26 10:25:20.732 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:25:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:21.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:21 compute-1 ceph-mon[80107]: pgmap v1300: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:25:22 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:22 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:22 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:22.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:25:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:25:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:25:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:25:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:25:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:25:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:23.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:25:24 compute-1 ceph-mon[80107]: pgmap v1301: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:25:24 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:24 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:24 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:24.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:25.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:25 compute-1 nova_compute[226322]: 2026-01-26 10:25:25.733 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:25:26 compute-1 ceph-mon[80107]: pgmap v1302: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:25:26 compute-1 podman[248349]: 2026-01-26 10:25:26.290860779 +0000 UTC m=+0.071707710 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 26 10:25:26 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:26 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:26 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:26.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:27.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:25:27 compute-1 sudo[248377]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:25:27 compute-1 sudo[248377]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:25:27 compute-1 sudo[248377]: pam_unix(sudo:session): session closed for user root
Jan 26 10:25:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:25:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:25:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:25:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:25:28 compute-1 ceph-mon[80107]: pgmap v1303: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:25:28 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:28 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:28 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:28.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:29.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:30 compute-1 ceph-mon[80107]: pgmap v1304: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:25:30 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:30 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:25:30 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:30.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:25:30 compute-1 nova_compute[226322]: 2026-01-26 10:25:30.735 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:25:30 compute-1 nova_compute[226322]: 2026-01-26 10:25:30.776 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:25:30 compute-1 nova_compute[226322]: 2026-01-26 10:25:30.776 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5042 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:25:30 compute-1 nova_compute[226322]: 2026-01-26 10:25:30.776 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:25:30 compute-1 nova_compute[226322]: 2026-01-26 10:25:30.778 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:25:30 compute-1 nova_compute[226322]: 2026-01-26 10:25:30.779 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:25:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:25:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:31.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:25:31 compute-1 ceph-mon[80107]: pgmap v1305: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:25:32 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:32 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 10:25:32 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:32.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 10:25:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:25:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:25:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:25:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:25:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:25:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:33.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:34 compute-1 ceph-mon[80107]: pgmap v1306: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:25:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:25:34 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:34 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:25:34 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:34.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:25:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:35.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:35 compute-1 nova_compute[226322]: 2026-01-26 10:25:35.780 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:25:36 compute-1 ceph-mon[80107]: pgmap v1307: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:25:36 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:36 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:25:36 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:36.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:25:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:37.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:25:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:25:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:25:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:25:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:25:38 compute-1 ceph-mon[80107]: pgmap v1308: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:25:38 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:38 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:38 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:38.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:25:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:39.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:25:40 compute-1 ceph-mon[80107]: pgmap v1309: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:25:40 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:40 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:25:40 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:40.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:25:40 compute-1 nova_compute[226322]: 2026-01-26 10:25:40.782 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:25:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:41.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:42 compute-1 ceph-mon[80107]: pgmap v1310: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:25:42 compute-1 podman[248409]: 2026-01-26 10:25:42.281935017 +0000 UTC m=+0.058712603 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 26 10:25:42 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:42 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:25:42 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:42.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:25:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:25:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:25:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:25:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:25:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:25:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:43.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:43 compute-1 sudo[248430]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:25:43 compute-1 sudo[248430]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:25:43 compute-1 sudo[248430]: pam_unix(sudo:session): session closed for user root
Jan 26 10:25:44 compute-1 nova_compute[226322]: 2026-01-26 10:25:44.007 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:25:44 compute-1 nova_compute[226322]: 2026-01-26 10:25:44.008 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:25:44 compute-1 nova_compute[226322]: 2026-01-26 10:25:44.008 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:25:44 compute-1 nova_compute[226322]: 2026-01-26 10:25:44.008 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:25:44 compute-1 nova_compute[226322]: 2026-01-26 10:25:44.009 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:25:44 compute-1 sudo[248455]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 26 10:25:44 compute-1 sudo[248455]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:25:44 compute-1 ceph-mon[80107]: pgmap v1311: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:25:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:45.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:25:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:45 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:45.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:45 compute-1 nova_compute[226322]: 2026-01-26 10:25:45.328 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:25:45 compute-1 nova_compute[226322]: 2026-01-26 10:25:45.330 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:25:45 compute-1 nova_compute[226322]: 2026-01-26 10:25:45.330 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:25:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/4072745689' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:25:45 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 26 10:25:45 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/490763872' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:25:45 compute-1 nova_compute[226322]: 2026-01-26 10:25:45.356 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:25:45 compute-1 podman[248553]: 2026-01-26 10:25:45.368067396 +0000 UTC m=+0.839612593 container exec 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Jan 26 10:25:45 compute-1 podman[248553]: 2026-01-26 10:25:45.468615556 +0000 UTC m=+0.940160823 container exec_died 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Jan 26 10:25:45 compute-1 nova_compute[226322]: 2026-01-26 10:25:45.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:25:45 compute-1 nova_compute[226322]: 2026-01-26 10:25:45.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:25:45 compute-1 nova_compute[226322]: 2026-01-26 10:25:45.784 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:25:45 compute-1 nova_compute[226322]: 2026-01-26 10:25:45.785 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:25:45 compute-1 nova_compute[226322]: 2026-01-26 10:25:45.785 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:25:45 compute-1 nova_compute[226322]: 2026-01-26 10:25:45.785 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:25:45 compute-1 nova_compute[226322]: 2026-01-26 10:25:45.786 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:25:45 compute-1 nova_compute[226322]: 2026-01-26 10:25:45.786 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:25:46 compute-1 podman[248691]: 2026-01-26 10:25:46.011788479 +0000 UTC m=+0.058425095 container exec 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 10:25:46 compute-1 podman[248691]: 2026-01-26 10:25:46.019876761 +0000 UTC m=+0.066513377 container exec_died 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 10:25:46 compute-1 podman[248762]: 2026-01-26 10:25:46.295457317 +0000 UTC m=+0.066862967 container exec 2481bf60245d7c2e187aaa8c0a64c69d854b50d0db3801811d86e7df60b7c5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 10:25:46 compute-1 podman[248762]: 2026-01-26 10:25:46.334293333 +0000 UTC m=+0.105698943 container exec_died 2481bf60245d7c2e187aaa8c0a64c69d854b50d0db3801811d86e7df60b7c5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 10:25:46 compute-1 ceph-mon[80107]: pgmap v1312: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:25:46 compute-1 podman[248828]: 2026-01-26 10:25:46.597119219 +0000 UTC m=+0.073168221 container exec 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 10:25:46 compute-1 podman[248828]: 2026-01-26 10:25:46.613150338 +0000 UTC m=+0.089199330 container exec_died 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 10:25:46 compute-1 podman[248896]: 2026-01-26 10:25:46.858303259 +0000 UTC m=+0.068703677 container exec 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., release=1793, com.redhat.component=keepalived-container, version=2.2.4, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, description=keepalived for Ceph, distribution-scope=public, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Jan 26 10:25:46 compute-1 podman[248896]: 2026-01-26 10:25:46.879185552 +0000 UTC m=+0.089585940 container exec_died 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, architecture=x86_64, description=keepalived for Ceph, distribution-scope=public, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, vcs-type=git, name=keepalived, release=1793)
Jan 26 10:25:46 compute-1 sudo[248455]: pam_unix(sudo:session): session closed for user root
Jan 26 10:25:47 compute-1 sudo[248929]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:25:47 compute-1 sudo[248929]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:25:47 compute-1 sudo[248929]: pam_unix(sudo:session): session closed for user root
Jan 26 10:25:47 compute-1 sudo[248954]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:25:47 compute-1 sudo[248954]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:25:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:47.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:25:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:47.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:25:47 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:25:47 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:25:47 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:25:47 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:25:47 compute-1 ceph-mon[80107]: pgmap v1313: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:25:47 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 26 10:25:47 compute-1 sudo[248954]: pam_unix(sudo:session): session closed for user root
Jan 26 10:25:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:25:47 compute-1 sudo[249011]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:25:47 compute-1 sudo[249011]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:25:47 compute-1 sudo[249011]: pam_unix(sudo:session): session closed for user root
Jan 26 10:25:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:25:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:25:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:25:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:25:48 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:25:48 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1077740556' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:25:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 26 10:25:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:25:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:25:48 compute-1 ceph-mon[80107]: pgmap v1314: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Jan 26 10:25:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:25:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:25:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:25:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:25:48 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:25:48 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1077740556' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:25:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:49.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:49.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:25:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3929538449' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:25:49 compute-1 nova_compute[226322]: 2026-01-26 10:25:49.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:25:50 compute-1 ceph-mon[80107]: pgmap v1315: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Jan 26 10:25:50 compute-1 nova_compute[226322]: 2026-01-26 10:25:50.788 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:25:50 compute-1 nova_compute[226322]: 2026-01-26 10:25:50.790 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:25:50 compute-1 nova_compute[226322]: 2026-01-26 10:25:50.790 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:25:50 compute-1 nova_compute[226322]: 2026-01-26 10:25:50.790 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:25:50 compute-1 nova_compute[226322]: 2026-01-26 10:25:50.812 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:25:50 compute-1 nova_compute[226322]: 2026-01-26 10:25:50.813 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:25:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:51.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:25:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:51.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:25:52 compute-1 sudo[249038]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:25:52 compute-1 sudo[249038]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:25:52 compute-1 sudo[249038]: pam_unix(sudo:session): session closed for user root
Jan 26 10:25:52 compute-1 nova_compute[226322]: 2026-01-26 10:25:52.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:25:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:25:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:25:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:25:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:25:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:25:53 compute-1 ceph-mon[80107]: pgmap v1316: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Jan 26 10:25:53 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:25:53 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:25:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:25:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:53.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:25:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:53.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:25:53.951 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:25:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:25:53.954 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:25:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:25:53.955 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:25:54 compute-1 nova_compute[226322]: 2026-01-26 10:25:54.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:25:54 compute-1 nova_compute[226322]: 2026-01-26 10:25:54.708 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:25:54 compute-1 nova_compute[226322]: 2026-01-26 10:25:54.708 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:25:54 compute-1 nova_compute[226322]: 2026-01-26 10:25:54.709 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:25:54 compute-1 nova_compute[226322]: 2026-01-26 10:25:54.710 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:25:54 compute-1 nova_compute[226322]: 2026-01-26 10:25:54.710 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:25:55 compute-1 ceph-mon[80107]: pgmap v1317: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Jan 26 10:25:55 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:25:55 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2985707705' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:25:55 compute-1 nova_compute[226322]: 2026-01-26 10:25:55.149 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:25:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:55.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:55.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:55 compute-1 nova_compute[226322]: 2026-01-26 10:25:55.354 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:25:55 compute-1 nova_compute[226322]: 2026-01-26 10:25:55.355 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4797MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:25:55 compute-1 nova_compute[226322]: 2026-01-26 10:25:55.355 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:25:55 compute-1 nova_compute[226322]: 2026-01-26 10:25:55.355 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:25:55 compute-1 nova_compute[226322]: 2026-01-26 10:25:55.417 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:25:55 compute-1 nova_compute[226322]: 2026-01-26 10:25:55.417 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:25:55 compute-1 nova_compute[226322]: 2026-01-26 10:25:55.440 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:25:55 compute-1 nova_compute[226322]: 2026-01-26 10:25:55.815 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:25:55 compute-1 nova_compute[226322]: 2026-01-26 10:25:55.819 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:25:55 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:25:55 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2820156879' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:25:55 compute-1 nova_compute[226322]: 2026-01-26 10:25:55.972 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.532s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:25:55 compute-1 nova_compute[226322]: 2026-01-26 10:25:55.979 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:25:55 compute-1 nova_compute[226322]: 2026-01-26 10:25:55.996 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:25:56 compute-1 nova_compute[226322]: 2026-01-26 10:25:55.999 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:25:56 compute-1 nova_compute[226322]: 2026-01-26 10:25:56.000 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:25:56 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2985707705' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:25:56 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2820156879' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:25:57 compute-1 ceph-mon[80107]: pgmap v1318: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Jan 26 10:25:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:57.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:57.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:57 compute-1 podman[249109]: 2026-01-26 10:25:57.363739922 +0000 UTC m=+0.131088670 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 26 10:25:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:25:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:25:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:25:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:25:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:25:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 10:25:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1247078311' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:25:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 10:25:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1247078311' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:25:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/1247078311' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:25:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/1247078311' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:25:59 compute-1 ceph-mon[80107]: pgmap v1319: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Jan 26 10:25:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:59.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:25:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:25:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:25:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:59.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:00 compute-1 nova_compute[226322]: 2026-01-26 10:26:00.818 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:26:01 compute-1 ceph-mon[80107]: pgmap v1320: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:26:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:01.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:01.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:26:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:26:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:26:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:26:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:26:03 compute-1 ceph-mon[80107]: pgmap v1321: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:26:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:03.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:03.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:26:05 compute-1 ceph-mon[80107]: pgmap v1322: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:26:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:05.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:26:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:05.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:26:05 compute-1 nova_compute[226322]: 2026-01-26 10:26:05.820 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:26:05 compute-1 nova_compute[226322]: 2026-01-26 10:26:05.822 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:26:05 compute-1 nova_compute[226322]: 2026-01-26 10:26:05.823 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:26:05 compute-1 nova_compute[226322]: 2026-01-26 10:26:05.823 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:26:05 compute-1 nova_compute[226322]: 2026-01-26 10:26:05.823 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:26:05 compute-1 nova_compute[226322]: 2026-01-26 10:26:05.825 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:26:06 compute-1 sshd-session[249137]: Connection reset by 198.235.24.70 port 57822 [preauth]
Jan 26 10:26:07 compute-1 ceph-mon[80107]: pgmap v1323: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:26:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:26:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:07.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:26:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:07.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:26:07 compute-1 sudo[249143]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:26:07 compute-1 sudo[249143]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:26:07 compute-1 sudo[249143]: pam_unix(sudo:session): session closed for user root
Jan 26 10:26:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:26:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:26:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:26:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:26:08 compute-1 ceph-mon[80107]: pgmap v1324: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:26:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:09.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:09.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:10 compute-1 ceph-mon[80107]: pgmap v1325: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:26:10 compute-1 nova_compute[226322]: 2026-01-26 10:26:10.825 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:26:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:11.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:11.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:26:12 compute-1 ceph-mon[80107]: pgmap v1326: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:26:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:26:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:26:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:26:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:26:13 compute-1 podman[249170]: 2026-01-26 10:26:13.292276403 +0000 UTC m=+0.058013273 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 10:26:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:13.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:13.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:14 compute-1 ceph-mon[80107]: pgmap v1327: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:26:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:15.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:15.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:15 compute-1 nova_compute[226322]: 2026-01-26 10:26:15.827 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:26:15 compute-1 nova_compute[226322]: 2026-01-26 10:26:15.828 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:26:15 compute-1 nova_compute[226322]: 2026-01-26 10:26:15.828 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:26:15 compute-1 nova_compute[226322]: 2026-01-26 10:26:15.828 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:26:15 compute-1 nova_compute[226322]: 2026-01-26 10:26:15.828 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:26:16 compute-1 ceph-mon[80107]: pgmap v1328: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:26:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:17.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:26:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:17.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:26:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:26:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:26:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:26:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:26:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:26:18 compute-1 ceph-mon[80107]: pgmap v1329: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:26:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:26:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:26:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:19.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:26:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:19.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:20 compute-1 nova_compute[226322]: 2026-01-26 10:26:20.829 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:26:20 compute-1 ceph-mon[80107]: pgmap v1330: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:26:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:21.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:21.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:26:22 compute-1 ceph-mon[80107]: pgmap v1331: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:26:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:26:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:26:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:26:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:26:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:23.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:26:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:23.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:26:24 compute-1 ceph-mon[80107]: pgmap v1332: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:26:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:25.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:25.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:25 compute-1 nova_compute[226322]: 2026-01-26 10:26:25.830 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:26:26 compute-1 ceph-mon[80107]: pgmap v1333: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:26:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:26:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:27.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:26:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:27.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:26:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:26:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:26:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:26:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:26:28 compute-1 sudo[249197]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:26:28 compute-1 sudo[249197]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:26:28 compute-1 sudo[249197]: pam_unix(sudo:session): session closed for user root
Jan 26 10:26:28 compute-1 podman[249221]: 2026-01-26 10:26:28.200717542 +0000 UTC m=+0.125534843 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 26 10:26:28 compute-1 ceph-mon[80107]: pgmap v1334: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:26:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:29.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:29.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:30 compute-1 nova_compute[226322]: 2026-01-26 10:26:30.832 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:26:30 compute-1 ceph-mon[80107]: pgmap v1335: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:26:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:31.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:31.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:26:32 compute-1 ceph-mon[80107]: pgmap v1336: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:26:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:26:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:26:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:26:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:26:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:26:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:33.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:26:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:26:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:33.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:26:33 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:26:34 compute-1 ceph-mon[80107]: pgmap v1337: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:26:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:35.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:35.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:35 compute-1 nova_compute[226322]: 2026-01-26 10:26:35.835 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:26:37 compute-1 ceph-mon[80107]: pgmap v1338: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:26:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:37.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:26:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:37.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:26:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:26:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:26:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:26:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:26:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:26:39 compute-1 ceph-mon[80107]: pgmap v1339: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:26:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:39.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:39.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:40 compute-1 nova_compute[226322]: 2026-01-26 10:26:40.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:26:40 compute-1 nova_compute[226322]: 2026-01-26 10:26:40.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 26 10:26:40 compute-1 nova_compute[226322]: 2026-01-26 10:26:40.836 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:26:40 compute-1 nova_compute[226322]: 2026-01-26 10:26:40.837 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:26:40 compute-1 nova_compute[226322]: 2026-01-26 10:26:40.837 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:26:40 compute-1 nova_compute[226322]: 2026-01-26 10:26:40.838 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:26:40 compute-1 nova_compute[226322]: 2026-01-26 10:26:40.838 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:26:40 compute-1 nova_compute[226322]: 2026-01-26 10:26:40.839 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:26:41 compute-1 ceph-mon[80107]: pgmap v1340: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:26:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:26:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:41.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:26:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:26:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:41.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:26:42 compute-1 nova_compute[226322]: 2026-01-26 10:26:42.724 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:26:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:26:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:26:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:26:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:26:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:26:43 compute-1 ceph-mon[80107]: pgmap v1341: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:26:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:26:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:43.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:26:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:43.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:43 compute-1 nova_compute[226322]: 2026-01-26 10:26:43.683 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:26:43 compute-1 nova_compute[226322]: 2026-01-26 10:26:43.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:26:43 compute-1 nova_compute[226322]: 2026-01-26 10:26:43.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:26:43 compute-1 nova_compute[226322]: 2026-01-26 10:26:43.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:26:45 compute-1 podman[249258]: 2026-01-26 10:26:45.027388598 +0000 UTC m=+0.080894703 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 10:26:45 compute-1 ceph-mon[80107]: pgmap v1342: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:26:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:45.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:45.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:45 compute-1 nova_compute[226322]: 2026-01-26 10:26:45.840 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:26:45 compute-1 nova_compute[226322]: 2026-01-26 10:26:45.841 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:26:45 compute-1 nova_compute[226322]: 2026-01-26 10:26:45.841 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:26:45 compute-1 nova_compute[226322]: 2026-01-26 10:26:45.842 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:26:45 compute-1 nova_compute[226322]: 2026-01-26 10:26:45.842 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:26:45 compute-1 nova_compute[226322]: 2026-01-26 10:26:45.843 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:26:46 compute-1 nova_compute[226322]: 2026-01-26 10:26:46.689 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:26:46 compute-1 nova_compute[226322]: 2026-01-26 10:26:46.689 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:26:46 compute-1 nova_compute[226322]: 2026-01-26 10:26:46.689 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:26:46 compute-1 nova_compute[226322]: 2026-01-26 10:26:46.712 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:26:46 compute-1 nova_compute[226322]: 2026-01-26 10:26:46.713 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:26:47 compute-1 ceph-mon[80107]: pgmap v1343: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:26:47 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1881061805' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:26:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:26:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:47.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:26:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:26:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:47.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:26:47 compute-1 nova_compute[226322]: 2026-01-26 10:26:47.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:26:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:26:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:26:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:26:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:26:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:26:48 compute-1 sudo[249279]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:26:48 compute-1 sudo[249279]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:26:48 compute-1 sudo[249279]: pam_unix(sudo:session): session closed for user root
Jan 26 10:26:48 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1218141819' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:26:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:49.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:49.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:49 compute-1 ceph-mon[80107]: pgmap v1344: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:26:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:26:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3868117298' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:26:50 compute-1 nova_compute[226322]: 2026-01-26 10:26:50.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:26:50 compute-1 nova_compute[226322]: 2026-01-26 10:26:50.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:26:50 compute-1 nova_compute[226322]: 2026-01-26 10:26:50.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 26 10:26:50 compute-1 nova_compute[226322]: 2026-01-26 10:26:50.717 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 26 10:26:50 compute-1 nova_compute[226322]: 2026-01-26 10:26:50.843 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:26:50 compute-1 nova_compute[226322]: 2026-01-26 10:26:50.845 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:26:50 compute-1 nova_compute[226322]: 2026-01-26 10:26:50.845 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:26:50 compute-1 nova_compute[226322]: 2026-01-26 10:26:50.846 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:26:50 compute-1 nova_compute[226322]: 2026-01-26 10:26:50.876 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:26:50 compute-1 nova_compute[226322]: 2026-01-26 10:26:50.877 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:26:50 compute-1 ceph-mon[80107]: pgmap v1345: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:26:50 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3891417777' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:26:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:26:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:51.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:26:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:51.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:52 compute-1 sudo[249306]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:26:52 compute-1 sudo[249306]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:26:52 compute-1 sudo[249306]: pam_unix(sudo:session): session closed for user root
Jan 26 10:26:52 compute-1 sudo[249331]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:26:52 compute-1 sudo[249331]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:26:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:26:52 compute-1 ceph-mon[80107]: pgmap v1346: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:26:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:26:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:26:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:26:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:26:53 compute-1 sudo[249331]: pam_unix(sudo:session): session closed for user root
Jan 26 10:26:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:26:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:53.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:26:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:53.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:26:53.953 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:26:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:26:53.954 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:26:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:26:53.954 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:26:53 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:26:53 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:26:53 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:26:53 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:26:53 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:26:53 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:26:53 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:26:54 compute-1 ceph-mon[80107]: pgmap v1347: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.104945) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423215105015, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 1586, "num_deletes": 257, "total_data_size": 3994264, "memory_usage": 4044808, "flush_reason": "Manual Compaction"}
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423215120693, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 2599346, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39473, "largest_seqno": 41054, "table_properties": {"data_size": 2592642, "index_size": 3839, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 14234, "raw_average_key_size": 19, "raw_value_size": 2579060, "raw_average_value_size": 3612, "num_data_blocks": 165, "num_entries": 714, "num_filter_entries": 714, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769423082, "oldest_key_time": 1769423082, "file_creation_time": 1769423215, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 15820 microseconds, and 6389 cpu microseconds.
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.120776) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 2599346 bytes OK
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.120796) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.125544) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.125564) EVENT_LOG_v1 {"time_micros": 1769423215125558, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.125582) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 3986937, prev total WAL file size 3986937, number of live WAL files 2.
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.126729) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303031' seq:72057594037927935, type:22 .. '6C6F676D0031323534' seq:0, type:0; will stop at (end)
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(2538KB)], [75(12MB)]
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423215126776, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 15677415, "oldest_snapshot_seqno": -1}
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6919 keys, 15513175 bytes, temperature: kUnknown
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423215200991, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 15513175, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15466854, "index_size": 27905, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17349, "raw_key_size": 181700, "raw_average_key_size": 26, "raw_value_size": 15342100, "raw_average_value_size": 2217, "num_data_blocks": 1103, "num_entries": 6919, "num_filter_entries": 6919, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769423215, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.201416) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 15513175 bytes
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.202999) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 210.7 rd, 208.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.5, 12.5 +0.0 blob) out(14.8 +0.0 blob), read-write-amplify(12.0) write-amplify(6.0) OK, records in: 7451, records dropped: 532 output_compression: NoCompression
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.203029) EVENT_LOG_v1 {"time_micros": 1769423215203016, "job": 46, "event": "compaction_finished", "compaction_time_micros": 74404, "compaction_time_cpu_micros": 34123, "output_level": 6, "num_output_files": 1, "total_output_size": 15513175, "num_input_records": 7451, "num_output_records": 6919, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423215204170, "job": 46, "event": "table_file_deletion", "file_number": 77}
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423215208626, "job": 46, "event": "table_file_deletion", "file_number": 75}
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.126576) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.208818) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.208824) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.208825) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.208827) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:26:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.208828) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:26:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:26:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:55.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:26:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:55.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:55 compute-1 nova_compute[226322]: 2026-01-26 10:26:55.717 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:26:55 compute-1 nova_compute[226322]: 2026-01-26 10:26:55.745 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:26:55 compute-1 nova_compute[226322]: 2026-01-26 10:26:55.745 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:26:55 compute-1 nova_compute[226322]: 2026-01-26 10:26:55.746 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:26:55 compute-1 nova_compute[226322]: 2026-01-26 10:26:55.746 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:26:55 compute-1 nova_compute[226322]: 2026-01-26 10:26:55.746 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:26:55 compute-1 nova_compute[226322]: 2026-01-26 10:26:55.926 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:26:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:26:56 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1505490843' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:26:56 compute-1 nova_compute[226322]: 2026-01-26 10:26:56.236 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:26:56 compute-1 nova_compute[226322]: 2026-01-26 10:26:56.409 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:26:56 compute-1 nova_compute[226322]: 2026-01-26 10:26:56.410 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4843MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:26:56 compute-1 nova_compute[226322]: 2026-01-26 10:26:56.410 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:26:56 compute-1 nova_compute[226322]: 2026-01-26 10:26:56.410 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:26:56 compute-1 nova_compute[226322]: 2026-01-26 10:26:56.597 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:26:56 compute-1 nova_compute[226322]: 2026-01-26 10:26:56.598 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:26:56 compute-1 nova_compute[226322]: 2026-01-26 10:26:56.612 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:26:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:26:57 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3786943580' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:26:57 compute-1 nova_compute[226322]: 2026-01-26 10:26:57.061 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:26:57 compute-1 nova_compute[226322]: 2026-01-26 10:26:57.066 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:26:57 compute-1 nova_compute[226322]: 2026-01-26 10:26:57.093 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:26:57 compute-1 nova_compute[226322]: 2026-01-26 10:26:57.095 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:26:57 compute-1 nova_compute[226322]: 2026-01-26 10:26:57.096 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:26:57 compute-1 ceph-mon[80107]: pgmap v1348: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 1 op/s
Jan 26 10:26:57 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1505490843' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:26:57 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3786943580' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:26:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:57.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:26:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:57.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:26:57 compute-1 sudo[249433]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:26:57 compute-1 sudo[249433]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:26:57 compute-1 sudo[249433]: pam_unix(sudo:session): session closed for user root
Jan 26 10:26:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:26:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:26:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:26:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:26:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:26:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 10:26:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4254120803' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:26:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 10:26:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4254120803' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:26:58 compute-1 ceph-mon[80107]: pgmap v1349: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Jan 26 10:26:58 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:26:58 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:26:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/4254120803' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:26:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/4254120803' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:26:59 compute-1 podman[249459]: 2026-01-26 10:26:59.311395003 +0000 UTC m=+0.084702646 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 26 10:26:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:26:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:59.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:26:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:26:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:26:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:59.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:27:00 compute-1 ceph-mon[80107]: pgmap v1350: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Jan 26 10:27:00 compute-1 nova_compute[226322]: 2026-01-26 10:27:00.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:27:00 compute-1 nova_compute[226322]: 2026-01-26 10:27:00.928 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:27:00 compute-1 nova_compute[226322]: 2026-01-26 10:27:00.930 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:27:00 compute-1 nova_compute[226322]: 2026-01-26 10:27:00.930 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:27:00 compute-1 nova_compute[226322]: 2026-01-26 10:27:00.930 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:27:00 compute-1 nova_compute[226322]: 2026-01-26 10:27:00.966 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:27:00 compute-1 nova_compute[226322]: 2026-01-26 10:27:00.967 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:27:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:01.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:01.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:02 compute-1 ceph-mon[80107]: pgmap v1351: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 1 op/s
Jan 26 10:27:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:27:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:27:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:27:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:27:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:27:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:03.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:27:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:03.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:27:04 compute-1 ceph-mon[80107]: pgmap v1352: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Jan 26 10:27:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:27:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:05.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:05.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:05 compute-1 nova_compute[226322]: 2026-01-26 10:27:05.967 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:27:05 compute-1 nova_compute[226322]: 2026-01-26 10:27:05.969 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:27:05 compute-1 nova_compute[226322]: 2026-01-26 10:27:05.969 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:27:05 compute-1 nova_compute[226322]: 2026-01-26 10:27:05.969 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:27:06 compute-1 nova_compute[226322]: 2026-01-26 10:27:06.001 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:27:06 compute-1 nova_compute[226322]: 2026-01-26 10:27:06.001 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:27:06 compute-1 ceph-mon[80107]: pgmap v1353: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:27:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:07.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:07.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:27:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:27:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:27:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:27:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:27:08 compute-1 sudo[249490]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:27:08 compute-1 sudo[249490]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:27:08 compute-1 sudo[249490]: pam_unix(sudo:session): session closed for user root
Jan 26 10:27:08 compute-1 ceph-mon[80107]: pgmap v1354: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:27:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:09.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:09.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:10 compute-1 ceph-mon[80107]: pgmap v1355: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:27:11 compute-1 nova_compute[226322]: 2026-01-26 10:27:11.001 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:27:11 compute-1 nova_compute[226322]: 2026-01-26 10:27:11.003 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:27:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:27:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:11.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:27:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:11.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:12 compute-1 ceph-mon[80107]: pgmap v1356: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:27:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:27:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:27:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:27:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:27:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:27:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:13.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:13.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:14 compute-1 ceph-mon[80107]: pgmap v1357: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:27:15 compute-1 podman[249518]: 2026-01-26 10:27:15.264770851 +0000 UTC m=+0.045116904 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 26 10:27:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:15.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:15.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:16 compute-1 nova_compute[226322]: 2026-01-26 10:27:16.004 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:27:16 compute-1 nova_compute[226322]: 2026-01-26 10:27:16.005 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:27:16 compute-1 nova_compute[226322]: 2026-01-26 10:27:16.005 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:27:16 compute-1 nova_compute[226322]: 2026-01-26 10:27:16.006 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:27:16 compute-1 nova_compute[226322]: 2026-01-26 10:27:16.044 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:27:16 compute-1 nova_compute[226322]: 2026-01-26 10:27:16.044 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:27:16 compute-1 ceph-mon[80107]: pgmap v1358: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:27:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:17.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:17.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:27:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:27:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:27:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:27:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:27:18 compute-1 ceph-mon[80107]: pgmap v1359: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:27:18 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:27:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:19.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:19.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:20 compute-1 ceph-mon[80107]: pgmap v1360: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:27:21 compute-1 nova_compute[226322]: 2026-01-26 10:27:21.045 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:27:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:27:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:21.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:27:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:21.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:27:22 compute-1 ceph-mon[80107]: pgmap v1361: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:27:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:27:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:27:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:27:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:27:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:27:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:23.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:27:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:23.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:24 compute-1 ceph-mon[80107]: pgmap v1362: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:27:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:25.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:25.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:26 compute-1 nova_compute[226322]: 2026-01-26 10:27:26.048 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:27:26 compute-1 nova_compute[226322]: 2026-01-26 10:27:26.049 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:27:26 compute-1 nova_compute[226322]: 2026-01-26 10:27:26.049 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:27:26 compute-1 nova_compute[226322]: 2026-01-26 10:27:26.050 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:27:26 compute-1 nova_compute[226322]: 2026-01-26 10:27:26.089 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:27:26 compute-1 nova_compute[226322]: 2026-01-26 10:27:26.090 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:27:26 compute-1 ceph-mon[80107]: pgmap v1363: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:27:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:27:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:27.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:27:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:27:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:27.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:27:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:27:27 compute-1 ceph-mon[80107]: pgmap v1364: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:27:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:27:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:27:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:27:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:27:28 compute-1 sudo[249544]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:27:28 compute-1 sudo[249544]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:27:28 compute-1 sudo[249544]: pam_unix(sudo:session): session closed for user root
Jan 26 10:27:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:29.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:29.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:30 compute-1 podman[249570]: 2026-01-26 10:27:30.333592658 +0000 UTC m=+0.105664805 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 10:27:30 compute-1 ceph-mon[80107]: pgmap v1365: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:27:31 compute-1 nova_compute[226322]: 2026-01-26 10:27:31.090 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:27:31 compute-1 nova_compute[226322]: 2026-01-26 10:27:31.092 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:27:31 compute-1 ceph-mon[80107]: pgmap v1366: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:27:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:31.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:27:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:31.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:27:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:27:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:27:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:27:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:27:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:27:33 compute-1 ceph-mon[80107]: pgmap v1367: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:27:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:33.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:33.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:27:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:35.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:35 compute-1 ceph-mon[80107]: pgmap v1368: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:27:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:35.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:36 compute-1 nova_compute[226322]: 2026-01-26 10:27:36.093 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:27:36 compute-1 nova_compute[226322]: 2026-01-26 10:27:36.095 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:27:36 compute-1 nova_compute[226322]: 2026-01-26 10:27:36.096 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:27:36 compute-1 nova_compute[226322]: 2026-01-26 10:27:36.096 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:27:36 compute-1 nova_compute[226322]: 2026-01-26 10:27:36.134 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:27:36 compute-1 nova_compute[226322]: 2026-01-26 10:27:36.134 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:27:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:37.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:37.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:37 compute-1 ceph-mon[80107]: pgmap v1369: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:27:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:27:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:27:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:27:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:27:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:27:39 compute-1 ceph-mon[80107]: pgmap v1370: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:27:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:39.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:39.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:41 compute-1 nova_compute[226322]: 2026-01-26 10:27:41.135 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:27:41 compute-1 ceph-mon[80107]: pgmap v1371: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:27:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:41.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:41.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:42 compute-1 nova_compute[226322]: 2026-01-26 10:27:42.697 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:27:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:27:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:27:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:27:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:27:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:27:43 compute-1 ceph-mon[80107]: pgmap v1372: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:27:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:43.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:43.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:43 compute-1 nova_compute[226322]: 2026-01-26 10:27:43.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:27:44 compute-1 nova_compute[226322]: 2026-01-26 10:27:44.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:27:45 compute-1 ceph-mon[80107]: pgmap v1373: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:27:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:45.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:45.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:45 compute-1 nova_compute[226322]: 2026-01-26 10:27:45.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:27:45 compute-1 nova_compute[226322]: 2026-01-26 10:27:45.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:27:46 compute-1 nova_compute[226322]: 2026-01-26 10:27:46.137 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:27:46 compute-1 nova_compute[226322]: 2026-01-26 10:27:46.139 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:27:46 compute-1 nova_compute[226322]: 2026-01-26 10:27:46.140 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:27:46 compute-1 nova_compute[226322]: 2026-01-26 10:27:46.140 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:27:46 compute-1 nova_compute[226322]: 2026-01-26 10:27:46.169 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:27:46 compute-1 nova_compute[226322]: 2026-01-26 10:27:46.170 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:27:46 compute-1 podman[249606]: 2026-01-26 10:27:46.299467852 +0000 UTC m=+0.066775001 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true)
Jan 26 10:27:47 compute-1 ceph-mon[80107]: pgmap v1374: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:27:47 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2695622730' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:27:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:27:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:47.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:27:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:47.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:47 compute-1 nova_compute[226322]: 2026-01-26 10:27:47.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:27:47 compute-1 nova_compute[226322]: 2026-01-26 10:27:47.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:27:47 compute-1 nova_compute[226322]: 2026-01-26 10:27:47.688 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:27:47 compute-1 nova_compute[226322]: 2026-01-26 10:27:47.706 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:27:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:27:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:27:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:27:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:27:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:27:48 compute-1 sudo[249626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:27:48 compute-1 sudo[249626]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:27:48 compute-1 sudo[249626]: pam_unix(sudo:session): session closed for user root
Jan 26 10:27:48 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1215905270' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:27:48 compute-1 nova_compute[226322]: 2026-01-26 10:27:48.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:27:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:27:49 compute-1 ceph-mon[80107]: pgmap v1375: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:27:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:49.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:49.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:49 compute-1 nova_compute[226322]: 2026-01-26 10:27:49.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:27:50 compute-1 nova_compute[226322]: 2026-01-26 10:27:50.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:27:51 compute-1 nova_compute[226322]: 2026-01-26 10:27:51.171 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:27:51 compute-1 ceph-mon[80107]: pgmap v1376: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:27:51 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1866049283' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:27:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:27:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:51.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:27:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:27:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:51.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:27:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3376433885' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:27:52 compute-1 nova_compute[226322]: 2026-01-26 10:27:52.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:27:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:27:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:27:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:27:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:27:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:27:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:27:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:53.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:27:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:53.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:53 compute-1 ceph-mon[80107]: pgmap v1377: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:27:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:27:53.955 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:27:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:27:53.955 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:27:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:27:53.955 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:27:55 compute-1 ceph-mon[80107]: pgmap v1378: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:27:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:55.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:55.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:55 compute-1 nova_compute[226322]: 2026-01-26 10:27:55.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:27:55 compute-1 nova_compute[226322]: 2026-01-26 10:27:55.709 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:27:55 compute-1 nova_compute[226322]: 2026-01-26 10:27:55.709 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:27:55 compute-1 nova_compute[226322]: 2026-01-26 10:27:55.710 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:27:55 compute-1 nova_compute[226322]: 2026-01-26 10:27:55.710 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:27:55 compute-1 nova_compute[226322]: 2026-01-26 10:27:55.710 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:27:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:27:56 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1747333867' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:27:56 compute-1 nova_compute[226322]: 2026-01-26 10:27:56.161 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:27:56 compute-1 nova_compute[226322]: 2026-01-26 10:27:56.173 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:27:56 compute-1 nova_compute[226322]: 2026-01-26 10:27:56.174 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:27:56 compute-1 nova_compute[226322]: 2026-01-26 10:27:56.174 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:27:56 compute-1 nova_compute[226322]: 2026-01-26 10:27:56.174 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:27:56 compute-1 nova_compute[226322]: 2026-01-26 10:27:56.209 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:27:56 compute-1 nova_compute[226322]: 2026-01-26 10:27:56.210 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:27:56 compute-1 nova_compute[226322]: 2026-01-26 10:27:56.384 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:27:56 compute-1 nova_compute[226322]: 2026-01-26 10:27:56.386 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4852MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:27:56 compute-1 nova_compute[226322]: 2026-01-26 10:27:56.386 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:27:56 compute-1 nova_compute[226322]: 2026-01-26 10:27:56.386 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:27:56 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1747333867' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:27:56 compute-1 nova_compute[226322]: 2026-01-26 10:27:56.522 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:27:56 compute-1 nova_compute[226322]: 2026-01-26 10:27:56.522 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:27:56 compute-1 nova_compute[226322]: 2026-01-26 10:27:56.537 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing inventories for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 26 10:27:56 compute-1 nova_compute[226322]: 2026-01-26 10:27:56.598 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating ProviderTree inventory for provider d06842a0-5d13-4573-bb78-d433bbb380e4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 26 10:27:56 compute-1 nova_compute[226322]: 2026-01-26 10:27:56.599 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating inventory in ProviderTree for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 10:27:56 compute-1 nova_compute[226322]: 2026-01-26 10:27:56.620 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing aggregate associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 26 10:27:56 compute-1 nova_compute[226322]: 2026-01-26 10:27:56.643 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing trait associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, traits: HW_CPU_X86_CLMUL,HW_CPU_X86_SSE,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_FMA3,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 26 10:27:56 compute-1 nova_compute[226322]: 2026-01-26 10:27:56.659 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:27:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:27:57 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4282693921' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:27:57 compute-1 nova_compute[226322]: 2026-01-26 10:27:57.115 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:27:57 compute-1 nova_compute[226322]: 2026-01-26 10:27:57.120 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:27:57 compute-1 nova_compute[226322]: 2026-01-26 10:27:57.139 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:27:57 compute-1 nova_compute[226322]: 2026-01-26 10:27:57.140 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:27:57 compute-1 nova_compute[226322]: 2026-01-26 10:27:57.140 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.754s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:27:57 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/4282693921' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:27:57 compute-1 ceph-mon[80107]: pgmap v1379: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:27:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:57.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:27:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:57.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:27:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:27:57 compute-1 sudo[249700]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:27:57 compute-1 sudo[249700]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:27:57 compute-1 sudo[249700]: pam_unix(sudo:session): session closed for user root
Jan 26 10:27:57 compute-1 sudo[249725]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:27:57 compute-1 sudo[249725]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:27:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:27:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:27:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:27:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:27:58 compute-1 sudo[249725]: pam_unix(sudo:session): session closed for user root
Jan 26 10:27:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/765544144' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:27:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/765544144' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:27:59 compute-1 ceph-mon[80107]: pgmap v1380: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:27:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:59.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:27:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:27:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:27:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:59.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:01 compute-1 nova_compute[226322]: 2026-01-26 10:28:01.244 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:28:01 compute-1 podman[249782]: 2026-01-26 10:28:01.341603606 +0000 UTC m=+0.084993225 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 26 10:28:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:01.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:01.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:02 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:28:02 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:28:02 compute-1 ceph-mon[80107]: pgmap v1381: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:28:02 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:28:02 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:28:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:28:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:28:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:28:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:28:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:28:03 compute-1 ceph-mon[80107]: pgmap v1382: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:28:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:28:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:28:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:28:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:28:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:28:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:28:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:03.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:28:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:03.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:28:05 compute-1 ceph-mon[80107]: pgmap v1383: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:28:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:05.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:05.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:06 compute-1 nova_compute[226322]: 2026-01-26 10:28:06.249 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:28:06 compute-1 nova_compute[226322]: 2026-01-26 10:28:06.252 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:28:06 compute-1 nova_compute[226322]: 2026-01-26 10:28:06.252 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:28:06 compute-1 nova_compute[226322]: 2026-01-26 10:28:06.252 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:28:06 compute-1 nova_compute[226322]: 2026-01-26 10:28:06.271 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:28:06 compute-1 nova_compute[226322]: 2026-01-26 10:28:06.272 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:28:07 compute-1 sudo[249812]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:28:07 compute-1 sudo[249812]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:28:07 compute-1 sudo[249812]: pam_unix(sudo:session): session closed for user root
Jan 26 10:28:07 compute-1 ceph-mon[80107]: pgmap v1384: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:28:07 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:28:07 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:28:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:07.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:07.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:28:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:28:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:28:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:28:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:28:08 compute-1 ceph-mon[80107]: pgmap v1385: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:28:08 compute-1 sudo[249838]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:28:08 compute-1 sudo[249838]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:28:08 compute-1 sudo[249838]: pam_unix(sudo:session): session closed for user root
Jan 26 10:28:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:09.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:09.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:10 compute-1 ceph-mon[80107]: pgmap v1386: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:28:11 compute-1 nova_compute[226322]: 2026-01-26 10:28:11.273 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:28:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:28:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:11.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:28:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:28:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:11.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:28:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:28:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:28:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:28:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:28:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:28:13 compute-1 ceph-mon[80107]: pgmap v1387: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.024279) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423293024309, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 1034, "num_deletes": 251, "total_data_size": 2473634, "memory_usage": 2507120, "flush_reason": "Manual Compaction"}
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423293039031, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 1584837, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 41059, "largest_seqno": 42088, "table_properties": {"data_size": 1580168, "index_size": 2257, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10334, "raw_average_key_size": 19, "raw_value_size": 1570776, "raw_average_value_size": 3009, "num_data_blocks": 99, "num_entries": 522, "num_filter_entries": 522, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769423216, "oldest_key_time": 1769423216, "file_creation_time": 1769423293, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 14789 microseconds, and 5192 cpu microseconds.
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.039066) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 1584837 bytes OK
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.039085) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.040830) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.040841) EVENT_LOG_v1 {"time_micros": 1769423293040838, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.040867) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 2468521, prev total WAL file size 2468521, number of live WAL files 2.
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.041605) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033323633' seq:72057594037927935, type:22 .. '7061786F730033353135' seq:0, type:0; will stop at (end)
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(1547KB)], [78(14MB)]
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423293041693, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 17098012, "oldest_snapshot_seqno": -1}
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 6925 keys, 14808621 bytes, temperature: kUnknown
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423293108022, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 14808621, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14763363, "index_size": 26842, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17349, "raw_key_size": 182522, "raw_average_key_size": 26, "raw_value_size": 14639581, "raw_average_value_size": 2114, "num_data_blocks": 1053, "num_entries": 6925, "num_filter_entries": 6925, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769423293, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.108246) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 14808621 bytes
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.110123) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 257.5 rd, 223.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 14.8 +0.0 blob) out(14.1 +0.0 blob), read-write-amplify(20.1) write-amplify(9.3) OK, records in: 7441, records dropped: 516 output_compression: NoCompression
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.110138) EVENT_LOG_v1 {"time_micros": 1769423293110131, "job": 48, "event": "compaction_finished", "compaction_time_micros": 66388, "compaction_time_cpu_micros": 27683, "output_level": 6, "num_output_files": 1, "total_output_size": 14808621, "num_input_records": 7441, "num_output_records": 6925, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423293110443, "job": 48, "event": "table_file_deletion", "file_number": 80}
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423293113043, "job": 48, "event": "table_file_deletion", "file_number": 78}
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.041510) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.113136) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.113146) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.113151) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.113155) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:28:13 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.113159) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:28:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:13.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:13.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:15 compute-1 ceph-mon[80107]: pgmap v1388: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:28:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:28:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:15.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:28:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:15.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:16 compute-1 nova_compute[226322]: 2026-01-26 10:28:16.276 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:28:17 compute-1 ceph-mon[80107]: pgmap v1389: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:28:17 compute-1 podman[249867]: 2026-01-26 10:28:17.274827747 +0000 UTC m=+0.057933322 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:28:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:17.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:28:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:17.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:28:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:28:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:28:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:28:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:28:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:28:19 compute-1 ceph-mon[80107]: pgmap v1390: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:28:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:28:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:19.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:19.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:20 compute-1 ceph-mon[80107]: pgmap v1391: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:28:21 compute-1 nova_compute[226322]: 2026-01-26 10:28:21.278 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:28:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:28:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:21.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:28:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:21.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:28:22 compute-1 ceph-mon[80107]: pgmap v1392: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:28:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:28:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:28:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:28:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:28:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:23.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:23.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:25 compute-1 ceph-mon[80107]: pgmap v1393: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:28:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:25.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:28:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:25.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:28:26 compute-1 nova_compute[226322]: 2026-01-26 10:28:26.280 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:28:27 compute-1 ceph-mon[80107]: pgmap v1394: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:28:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:27.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:27.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:28:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:28:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:28:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:28:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:28:28 compute-1 sudo[249893]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:28:28 compute-1 sudo[249893]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:28:28 compute-1 sudo[249893]: pam_unix(sudo:session): session closed for user root
Jan 26 10:28:29 compute-1 ceph-mon[80107]: pgmap v1395: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:28:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:29.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:29.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:31 compute-1 ceph-mon[80107]: pgmap v1396: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:28:31 compute-1 nova_compute[226322]: 2026-01-26 10:28:31.282 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:28:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:31.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:28:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:31.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:28:32 compute-1 podman[249920]: 2026-01-26 10:28:32.387012259 +0000 UTC m=+0.157645105 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 10:28:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:28:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:28:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:28:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:28:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:28:33 compute-1 ceph-mon[80107]: pgmap v1397: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:28:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:28:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:33.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:28:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:28:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:33.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:28:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:28:35 compute-1 ceph-mon[80107]: pgmap v1398: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:28:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:35.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:35.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:36 compute-1 nova_compute[226322]: 2026-01-26 10:28:36.285 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:28:36 compute-1 nova_compute[226322]: 2026-01-26 10:28:36.286 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:28:36 compute-1 nova_compute[226322]: 2026-01-26 10:28:36.287 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:28:36 compute-1 nova_compute[226322]: 2026-01-26 10:28:36.287 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:28:36 compute-1 nova_compute[226322]: 2026-01-26 10:28:36.288 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:28:36 compute-1 nova_compute[226322]: 2026-01-26 10:28:36.290 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:28:37 compute-1 ceph-mon[80107]: pgmap v1399: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:28:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:28:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:37.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:28:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:28:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:37.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:28:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:28:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:28:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:28:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:28:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:28:39 compute-1 ceph-mon[80107]: pgmap v1400: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:28:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:39.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:39.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:41 compute-1 ceph-mon[80107]: pgmap v1401: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:28:41 compute-1 nova_compute[226322]: 2026-01-26 10:28:41.291 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:28:41 compute-1 nova_compute[226322]: 2026-01-26 10:28:41.293 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:28:41 compute-1 nova_compute[226322]: 2026-01-26 10:28:41.293 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:28:41 compute-1 nova_compute[226322]: 2026-01-26 10:28:41.293 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:28:41 compute-1 nova_compute[226322]: 2026-01-26 10:28:41.304 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:28:41 compute-1 nova_compute[226322]: 2026-01-26 10:28:41.304 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:28:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:28:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:41.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:28:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:28:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:41.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:28:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:28:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:28:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:28:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:28:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:28:43 compute-1 ceph-mon[80107]: pgmap v1402: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:28:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:43.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:43.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:44 compute-1 nova_compute[226322]: 2026-01-26 10:28:44.141 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:28:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:45.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:28:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:45.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:28:45 compute-1 nova_compute[226322]: 2026-01-26 10:28:45.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:28:46 compute-1 nova_compute[226322]: 2026-01-26 10:28:46.305 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:28:46 compute-1 ceph-mon[80107]: pgmap v1403: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:28:46 compute-1 nova_compute[226322]: 2026-01-26 10:28:46.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:28:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:47.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:47.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:47 compute-1 ceph-mon[80107]: pgmap v1404: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:28:47 compute-1 nova_compute[226322]: 2026-01-26 10:28:47.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:28:47 compute-1 nova_compute[226322]: 2026-01-26 10:28:47.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:28:47 compute-1 nova_compute[226322]: 2026-01-26 10:28:47.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:28:47 compute-1 nova_compute[226322]: 2026-01-26 10:28:47.700 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:28:47 compute-1 nova_compute[226322]: 2026-01-26 10:28:47.700 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:28:47 compute-1 nova_compute[226322]: 2026-01-26 10:28:47.701 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:28:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:28:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:28:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:28:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:28:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:28:48 compute-1 podman[249954]: 2026-01-26 10:28:48.265443942 +0000 UTC m=+0.045536864 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 26 10:28:48 compute-1 sudo[249973]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:28:48 compute-1 ceph-mon[80107]: pgmap v1405: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:28:48 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3262378480' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:28:48 compute-1 sudo[249973]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:28:48 compute-1 sudo[249973]: pam_unix(sudo:session): session closed for user root
Jan 26 10:28:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:28:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:49.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:28:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:49.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:28:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/947100076' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:28:49 compute-1 nova_compute[226322]: 2026-01-26 10:28:49.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:28:50 compute-1 nova_compute[226322]: 2026-01-26 10:28:50.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:28:50 compute-1 ceph-mon[80107]: pgmap v1406: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:28:51 compute-1 nova_compute[226322]: 2026-01-26 10:28:51.306 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:28:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:51.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:28:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:51.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:28:51 compute-1 nova_compute[226322]: 2026-01-26 10:28:51.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:28:51 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3768636190' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:28:52 compute-1 ceph-mon[80107]: pgmap v1407: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:28:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2897644853' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:28:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:28:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:28:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:28:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:28:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:28:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:28:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:53.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:28:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:28:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:53.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:28:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:28:53.956 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:28:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:28:53.957 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:28:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:28:53.957 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:28:54 compute-1 ceph-mon[80107]: pgmap v1408: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:28:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:55.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:28:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:55.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:28:56 compute-1 nova_compute[226322]: 2026-01-26 10:28:56.308 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:28:56 compute-1 nova_compute[226322]: 2026-01-26 10:28:56.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:28:56 compute-1 nova_compute[226322]: 2026-01-26 10:28:56.715 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:28:56 compute-1 nova_compute[226322]: 2026-01-26 10:28:56.715 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:28:56 compute-1 nova_compute[226322]: 2026-01-26 10:28:56.716 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:28:56 compute-1 nova_compute[226322]: 2026-01-26 10:28:56.716 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:28:56 compute-1 nova_compute[226322]: 2026-01-26 10:28:56.716 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:28:56 compute-1 ceph-mon[80107]: pgmap v1409: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:28:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:28:57 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1175763920' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:28:57 compute-1 nova_compute[226322]: 2026-01-26 10:28:57.209 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:28:57 compute-1 nova_compute[226322]: 2026-01-26 10:28:57.432 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:28:57 compute-1 nova_compute[226322]: 2026-01-26 10:28:57.435 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4837MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:28:57 compute-1 nova_compute[226322]: 2026-01-26 10:28:57.436 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:28:57 compute-1 nova_compute[226322]: 2026-01-26 10:28:57.436 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:28:57 compute-1 nova_compute[226322]: 2026-01-26 10:28:57.519 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:28:57 compute-1 nova_compute[226322]: 2026-01-26 10:28:57.519 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:28:57 compute-1 nova_compute[226322]: 2026-01-26 10:28:57.539 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:28:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:57.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:28:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:28:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:57.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:28:57 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1175763920' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:28:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:28:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:28:57 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2657310510' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:28:57 compute-1 nova_compute[226322]: 2026-01-26 10:28:57.984 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:28:57 compute-1 nova_compute[226322]: 2026-01-26 10:28:57.991 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:28:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:28:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:28:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:28:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:28:58 compute-1 nova_compute[226322]: 2026-01-26 10:28:58.011 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:28:58 compute-1 nova_compute[226322]: 2026-01-26 10:28:58.013 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:28:58 compute-1 nova_compute[226322]: 2026-01-26 10:28:58.013 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:28:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 10:28:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3037181311' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:28:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 10:28:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3037181311' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:28:58 compute-1 ceph-mon[80107]: pgmap v1410: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:28:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2657310510' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:28:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/3037181311' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:28:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/3037181311' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:28:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:28:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000054s ======
Jan 26 10:28:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:59.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 26 10:28:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:28:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:28:59 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:59.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:00 compute-1 ceph-mon[80107]: pgmap v1411: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:29:01 compute-1 nova_compute[226322]: 2026-01-26 10:29:01.309 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:29:01 compute-1 nova_compute[226322]: 2026-01-26 10:29:01.311 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:29:01 compute-1 nova_compute[226322]: 2026-01-26 10:29:01.311 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:29:01 compute-1 nova_compute[226322]: 2026-01-26 10:29:01.312 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:29:01 compute-1 nova_compute[226322]: 2026-01-26 10:29:01.378 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:29:01 compute-1 nova_compute[226322]: 2026-01-26 10:29:01.379 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:29:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:01.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:01.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:29:02 compute-1 ceph-mon[80107]: pgmap v1412: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:29:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:29:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:29:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:29:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:29:03 compute-1 podman[250049]: 2026-01-26 10:29:03.305325519 +0000 UTC m=+0.085128075 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 26 10:29:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:03.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:29:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:03.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:29:03 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:29:05 compute-1 ceph-mon[80107]: pgmap v1413: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:29:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:05.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:05.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:06 compute-1 nova_compute[226322]: 2026-01-26 10:29:06.380 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:29:07 compute-1 ceph-mon[80107]: pgmap v1414: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:29:07 compute-1 sudo[250077]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 26 10:29:07 compute-1 sudo[250077]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:29:07 compute-1 sudo[250077]: pam_unix(sudo:session): session closed for user root
Jan 26 10:29:07 compute-1 sudo[250102]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 26 10:29:07 compute-1 sudo[250102]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:29:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:29:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:07.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:29:07 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:07 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:07 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:07.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:29:07 compute-1 sudo[250102]: pam_unix(sudo:session): session closed for user root
Jan 26 10:29:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:29:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:29:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:29:08 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:29:08 compute-1 sudo[250160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:29:08 compute-1 sudo[250160]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:29:08 compute-1 sudo[250160]: pam_unix(sudo:session): session closed for user root
Jan 26 10:29:09 compute-1 ceph-mon[80107]: pgmap v1415: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:29:09 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:29:09 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 10:29:09 compute-1 ceph-mon[80107]: pgmap v1416: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:29:09 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:29:09 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:29:09 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 26 10:29:09 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 10:29:09 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:29:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:09 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:09.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:09 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:29:09 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:09 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:09.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:11 compute-1 ceph-mon[80107]: pgmap v1417: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 1 op/s
Jan 26 10:29:11 compute-1 nova_compute[226322]: 2026-01-26 10:29:11.382 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:29:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:29:11 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:29:11 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:11.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:29:11 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:11 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:11.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:12 compute-1 sudo[250187]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 26 10:29:12 compute-1 sudo[250187]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:29:12 compute-1 sudo[250187]: pam_unix(sudo:session): session closed for user root
Jan 26 10:29:12 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:29:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:29:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:29:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:29:13 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:29:13 compute-1 ceph-mon[80107]: pgmap v1418: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:29:13 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:29:13 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 10:29:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:13 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:29:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:13 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:13.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:13 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:29:13 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:13.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:29:15 compute-1 ceph-mon[80107]: pgmap v1419: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:29:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:29:15 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:15 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:15.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:15 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:29:15 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:15.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:29:16 compute-1 nova_compute[226322]: 2026-01-26 10:29:16.385 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:29:17 compute-1 ceph-mon[80107]: pgmap v1420: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:29:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:17 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:29:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:29:17 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:29:17 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:17.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:29:17 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:17.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:29:17 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:29:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:29:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:29:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:29:18 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:29:19 compute-1 ceph-mon[80107]: pgmap v1421: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 26 10:29:19 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:29:19 compute-1 podman[250216]: 2026-01-26 10:29:19.288429611 +0000 UTC m=+0.065764556 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 10:29:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:19 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:29:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:19 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:19.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:19 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:19 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:19.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:21 compute-1 ceph-mon[80107]: pgmap v1422: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:29:21 compute-1 nova_compute[226322]: 2026-01-26 10:29:21.386 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:29:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:21 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:29:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:29:21 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:21.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:29:21 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:21 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:21.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:22 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:29:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:29:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:29:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:29:23 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:29:23 compute-1 ceph-mon[80107]: pgmap v1423: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:29:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:29:23 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:23 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:23 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:23.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:23 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:23.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:25 compute-1 ceph-mon[80107]: pgmap v1424: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:29:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:25 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:29:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:29:25 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:29:25 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:25.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:29:25 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:25.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:29:26 compute-1 nova_compute[226322]: 2026-01-26 10:29:26.388 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:29:27 compute-1 ceph-mon[80107]: pgmap v1425: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:29:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:27 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 10:29:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:27 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:27.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:27 compute-1 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:29:27 compute-1 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:27.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:29:27 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:29:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:29:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:29:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:29:28 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:29:28 compute-1 sudo[250241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:29:28 compute-1 sudo[250241]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:29:28 compute-1 sudo[250241]: pam_unix(sudo:session): session closed for user root
Jan 26 10:29:29 compute-1 ceph-mon[80107]: pgmap v1426: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:29:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:29.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:29 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:29 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:29 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:29.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:31 compute-1 ceph-mon[80107]: pgmap v1427: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:29:31 compute-1 nova_compute[226322]: 2026-01-26 10:29:31.390 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:29:31 compute-1 nova_compute[226322]: 2026-01-26 10:29:31.391 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:29:31 compute-1 nova_compute[226322]: 2026-01-26 10:29:31.391 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:29:31 compute-1 nova_compute[226322]: 2026-01-26 10:29:31.392 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:29:31 compute-1 nova_compute[226322]: 2026-01-26 10:29:31.426 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:29:31 compute-1 nova_compute[226322]: 2026-01-26 10:29:31.427 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:29:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:29:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:31.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:29:31 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:31 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:31 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:31.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:32 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:29:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:29:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:29:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:29:33 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:29:33 compute-1 ceph-mon[80107]: pgmap v1428: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:29:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:33.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:33 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:33 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:33 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:33.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:34 compute-1 podman[250269]: 2026-01-26 10:29:34.357422892 +0000 UTC m=+0.139612813 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:29:34 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:29:35 compute-1 ceph-mon[80107]: pgmap v1429: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:29:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:35.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:35 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:35 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:29:35 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:35.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:29:36 compute-1 nova_compute[226322]: 2026-01-26 10:29:36.428 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:29:36 compute-1 nova_compute[226322]: 2026-01-26 10:29:36.429 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:29:36 compute-1 nova_compute[226322]: 2026-01-26 10:29:36.429 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:29:36 compute-1 nova_compute[226322]: 2026-01-26 10:29:36.429 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:29:36 compute-1 nova_compute[226322]: 2026-01-26 10:29:36.430 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:29:36 compute-1 nova_compute[226322]: 2026-01-26 10:29:36.432 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:29:37 compute-1 ceph-mon[80107]: pgmap v1430: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:29:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:37.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:37 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:37 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:37 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:37.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:37 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:29:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:29:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:29:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:29:38 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:29:39 compute-1 sshd-session[250297]: Accepted publickey for zuul from 192.168.122.10 port 57356 ssh2: ECDSA SHA256:3+mD6W9podl8Ei5P+Dtw+049tIr7OsvnVW8okhUeQyk
Jan 26 10:29:39 compute-1 systemd-logind[783]: New session 58 of user zuul.
Jan 26 10:29:39 compute-1 systemd[1]: Started Session 58 of User zuul.
Jan 26 10:29:39 compute-1 sshd-session[250297]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 10:29:39 compute-1 sudo[250301]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 26 10:29:39 compute-1 sudo[250301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 10:29:39 compute-1 ceph-mon[80107]: pgmap v1431: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:29:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:39.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:39 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:39 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:39 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:39.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:41 compute-1 nova_compute[226322]: 2026-01-26 10:29:41.432 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:29:41 compute-1 ceph-mon[80107]: pgmap v1432: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:29:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:41.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:41 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:41 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:29:41 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:41.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:29:42 compute-1 ceph-mon[80107]: from='client.17913 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:42 compute-1 ceph-mon[80107]: from='client.27683 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:42 compute-1 ceph-mon[80107]: from='client.27307 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:42 compute-1 ceph-mon[80107]: from='client.17922 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:42 compute-1 ceph-mon[80107]: pgmap v1433: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:29:42 compute-1 ceph-mon[80107]: from='client.27692 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:42 compute-1 ceph-mon[80107]: from='client.27322 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:42 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/304949107' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 26 10:29:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 26 10:29:42 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2606455545' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 26 10:29:42 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:29:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:29:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:29:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:29:43 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:29:43 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2606455545' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 26 10:29:43 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1179573448' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 26 10:29:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:43.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:43 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:43 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:29:43 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:43.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:29:44 compute-1 ceph-mon[80107]: pgmap v1434: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:29:45 compute-1 nova_compute[226322]: 2026-01-26 10:29:45.014 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:29:45 compute-1 ovs-vsctl[250625]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 26 10:29:45 compute-1 nova_compute[226322]: 2026-01-26 10:29:45.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:29:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:45.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:45 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:45 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:45 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:45.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:46 compute-1 nova_compute[226322]: 2026-01-26 10:29:46.433 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:29:46 compute-1 virtqemud[225791]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 26 10:29:46 compute-1 ceph-mon[80107]: pgmap v1435: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:29:46 compute-1 virtqemud[225791]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 26 10:29:46 compute-1 virtqemud[225791]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 26 10:29:46 compute-1 nova_compute[226322]: 2026-01-26 10:29:46.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:29:47 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: cache status {prefix=cache status} (starting...)
Jan 26 10:29:47 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:29:47 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: client ls {prefix=client ls} (starting...)
Jan 26 10:29:47 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:29:47 compute-1 lvm[250980]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 10:29:47 compute-1 lvm[250980]: VG ceph_vg0 finished
Jan 26 10:29:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:47.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:47 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:47 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:47 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:47.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:47 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: damage ls {prefix=damage ls} (starting...)
Jan 26 10:29:47 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:29:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:29:47 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump loads {prefix=dump loads} (starting...)
Jan 26 10:29:47 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:29:47 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Jan 26 10:29:47 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1597042348' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 26 10:29:47 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 26 10:29:47 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:29:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:29:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:29:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:29:48 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:29:48 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1597042348' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 26 10:29:48 compute-1 ceph-mon[80107]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 26 10:29:48 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2963176918' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 26 10:29:48 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 26 10:29:48 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:29:48 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 26 10:29:48 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:29:48 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Jan 26 10:29:48 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3490216551' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:29:48 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 26 10:29:48 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:29:48 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 26 10:29:48 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:29:48 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Jan 26 10:29:48 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3430578706' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 26 10:29:48 compute-1 nova_compute[226322]: 2026-01-26 10:29:48.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:29:48 compute-1 nova_compute[226322]: 2026-01-26 10:29:48.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 10:29:48 compute-1 nova_compute[226322]: 2026-01-26 10:29:48.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 10:29:48 compute-1 nova_compute[226322]: 2026-01-26 10:29:48.711 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 10:29:48 compute-1 nova_compute[226322]: 2026-01-26 10:29:48.711 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:29:48 compute-1 nova_compute[226322]: 2026-01-26 10:29:48.712 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 10:29:48 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 26 10:29:48 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:29:48 compute-1 sudo[251226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 26 10:29:48 compute-1 sudo[251226]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 26 10:29:48 compute-1 sudo[251226]: pam_unix(sudo:session): session closed for user root
Jan 26 10:29:49 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: ops {prefix=ops} (starting...)
Jan 26 10:29:49 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:29:49 compute-1 ceph-mon[80107]: from='client.27716 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:49 compute-1 ceph-mon[80107]: from='client.17943 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:49 compute-1 ceph-mon[80107]: from='client.27728 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:49 compute-1 ceph-mon[80107]: from='client.17958 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:49 compute-1 ceph-mon[80107]: pgmap v1436: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:29:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3490216551' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:29:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/62027242' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:29:49 compute-1 ceph-mon[80107]: from='client.27740 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3430578706' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 26 10:29:49 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/513814966' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 26 10:29:49 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:29:49 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Jan 26 10:29:49 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3278578878' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 26 10:29:49 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: session ls {prefix=session ls} (starting...)
Jan 26 10:29:49 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 10:29:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:49.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:49 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:49 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 10:29:49 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:49.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 10:29:49 compute-1 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: status {prefix=status} (starting...)
Jan 26 10:29:49 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 26 10:29:49 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4111243696' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 10:29:50 compute-1 ceph-mon[80107]: from='client.17970 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:50 compute-1 ceph-mon[80107]: from='client.27346 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:50 compute-1 ceph-mon[80107]: from='client.27755 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:50 compute-1 ceph-mon[80107]: from='client.17985 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:50 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3446623633' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 26 10:29:50 compute-1 ceph-mon[80107]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 26 10:29:50 compute-1 ceph-mon[80107]: from='client.27373 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:50 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1223797148' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 26 10:29:50 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3278578878' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 26 10:29:50 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2403931818' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 26 10:29:50 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1209062128' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 26 10:29:50 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3162243807' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 10:29:50 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/70919432' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:29:50 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3572298696' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 10:29:50 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/926128577' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 26 10:29:50 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/4111243696' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 10:29:50 compute-1 podman[251416]: 2026-01-26 10:29:50.268476828 +0000 UTC m=+0.050871920 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 26 10:29:50 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 26 10:29:50 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/266995504' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 10:29:50 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Jan 26 10:29:50 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1646311646' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 26 10:29:50 compute-1 nova_compute[226322]: 2026-01-26 10:29:50.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:29:50 compute-1 nova_compute[226322]: 2026-01-26 10:29:50.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:29:50 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 26 10:29:50 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2590876303' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 26 10:29:50 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Jan 26 10:29:50 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2610141522' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 26 10:29:51 compute-1 ceph-mon[80107]: from='client.27394 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:51 compute-1 ceph-mon[80107]: from='client.18021 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:51 compute-1 ceph-mon[80107]: from='client.27788 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:51 compute-1 ceph-mon[80107]: from='client.27409 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:51 compute-1 ceph-mon[80107]: from='client.27418 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:51 compute-1 ceph-mon[80107]: from='client.18042 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:51 compute-1 ceph-mon[80107]: pgmap v1437: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:29:51 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2683446461' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:29:51 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1733620249' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 10:29:51 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/266995504' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 10:29:51 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2091732254' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 26 10:29:51 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1765323621' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 26 10:29:51 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1646311646' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 26 10:29:51 compute-1 ceph-mon[80107]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 26 10:29:51 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2908729226' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 26 10:29:51 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2194018511' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 26 10:29:51 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2590876303' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 26 10:29:51 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1140252076' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 10:29:51 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2610141522' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 26 10:29:51 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1332807695' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 26 10:29:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 26 10:29:51 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/965486565' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 10:29:51 compute-1 nova_compute[226322]: 2026-01-26 10:29:51.435 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:29:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Jan 26 10:29:51 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2903462977' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 26 10:29:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:51.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:51 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:51 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:51 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:51.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:51 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Jan 26 10:29:51 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2842339571' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 26 10:29:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 26 10:29:52 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1562471956' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 10:29:52 compute-1 ceph-mon[80107]: from='client.27454 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:52 compute-1 ceph-mon[80107]: from='client.27484 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2634675257' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 10:29:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/965486565' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 10:29:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2839110286' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 10:29:52 compute-1 ceph-mon[80107]: from='client.27881 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:52 compute-1 ceph-mon[80107]: from='client.18114 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2870307844' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 26 10:29:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3561081167' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 26 10:29:52 compute-1 ceph-mon[80107]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 26 10:29:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2903462977' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 26 10:29:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2080751942' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 26 10:29:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2842339571' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 26 10:29:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3034132189' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 26 10:29:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3117585736' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 10:29:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1562471956' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 10:29:52 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/27549982' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 26 10:29:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Jan 26 10:29:52 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3441896431' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 26 10:29:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 26 10:29:52 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/657819002' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 10:29:52 compute-1 nova_compute[226322]: 2026-01-26 10:29:52.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:29:52 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:29:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:29:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:29:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:29:53 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:06.399090+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:07.399273+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:08.399508+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:09.399741+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:10.399957+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:11.400149+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:12.400326+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:13.400545+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:14.400746+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:15.400921+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:16.401089+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:17.401279+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:18.401482+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:19.401656+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:20.401832+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:21.401996+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:22.402186+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:23.402365+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:24.402543+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:25.402763+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:26.402969+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d170b6b400 session 0x55d1721d3a40
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172162400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:27.403125+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:28.403405+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:29.403584+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:30.403771+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:31.403930+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:32.404077+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:33.404199+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:34.404351+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77627392 unmapped: 892928 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:35.404498+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77627392 unmapped: 892928 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:36.404773+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 74.881858826s of 74.889221191s, submitted: 2
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77643776 unmapped: 876544 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:37.404938+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77651968 unmapped: 868352 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:38.405062+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77651968 unmapped: 868352 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:39.405214+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17042a000 session 0x55d1720950e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:40.405906+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936906 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:41.406076+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:42.406268+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:43.406447+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:44.406599+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:45.406775+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936906 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:46.406955+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:47.407100+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:48.407257+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:49.407407+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:50.407589+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936906 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:51.407749+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:52.407879+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:53.408047+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:54.408167+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.996082306s of 18.000623703s, submitted: 1
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:55.408267+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938418 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:56.408435+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:57.408563+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:58.408741+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:57:59.408908+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:00.409049+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938418 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:01.409163+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:02.409298+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:03.409475+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:04.409669+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:05.409814+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938418 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:06.410089+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:07.410245+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:08.410430+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:09.410612+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.481588364s of 14.488451958s, submitted: 1
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77701120 unmapped: 819200 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:10.411328+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77709312 unmapped: 811008 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939339 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:11.411519+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77709312 unmapped: 811008 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:12.411746+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77709312 unmapped: 811008 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:13.411916+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77709312 unmapped: 811008 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:14.412301+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:15.412416+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77709312 unmapped: 811008 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:16.412596+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 794624 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:17.412802+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 794624 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:18.412920+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 794624 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:19.413075+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 794624 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:20.413231+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:21.413378+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:22.413559+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:23.413717+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:24.413869+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:25.414014+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:26.414236+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:27.414428+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:28.414595+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:29.414771+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:30.414914+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:31.415099+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:32.415256+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:33.415412+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:34.415546+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:35.415770+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:36.415924+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:37.416087+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:38.416229+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:39.416405+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:40.416620+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:41.416877+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:42.417047+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:43.417402+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:44.417536+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:45.417679+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:46.417884+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:47.418058+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:48.418887+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:49.419050+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17042a800 session 0x55d172095c20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:50.419287+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17035d000 session 0x55d17260d0e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:51.419520+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:52.419815+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:53.420029+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:54.420194+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:55.420340+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:56.421003+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:57.421136+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:58.421321+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 729088 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:58:59.421563+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 729088 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:00.421725+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:01.421886+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:02.422063+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:03.422273+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:04.422467+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:05.422651+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:06.422946+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17035d000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 56.857223511s of 56.950168610s, submitted: 4
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:07.423142+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:08.423273+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:09.423401+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:10.423540+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:11.423825+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:12.424019+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:13.424333+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:14.424776+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:15.425546+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:16.427011+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:17.427191+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:18.427399+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:19.427519+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 696320 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:20.427767+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77840384 unmapped: 679936 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:21.428854+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77840384 unmapped: 679936 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:22.429004+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77840384 unmapped: 679936 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:23.429191+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:24.429318+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:25.429459+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:26.429657+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:27.429810+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:28.429966+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:29.430089+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:30.430231+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:31.430401+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:32.430610+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:33.430786+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:34.430929+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:35.431097+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:36.431280+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:37.431426+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:38.431637+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:39.431764+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:40.431909+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:41.432046+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:42.432180+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:43.432423+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:44.432586+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:45.432723+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:46.432867+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:47.433007+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:48.433128+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17035d000 session 0x55d172e81860
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:49.433272+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:50.433463+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:51.433592+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 6838 writes, 27K keys, 6838 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 6838 writes, 1330 syncs, 5.14 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 522 writes, 785 keys, 522 commit groups, 1.0 writes per commit group, ingest: 0.25 MB, 0.00 MB/s
                                           Interval WAL: 522 writes, 261 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:52.433772+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/352923986' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 10:29:53 compute-1 ceph-mon[80107]: pgmap v1438: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:29:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3441896431' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 26 10:29:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2923571804' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 26 10:29:53 compute-1 ceph-mon[80107]: from='client.18147 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:53 compute-1 ceph-mon[80107]: from='client.27550 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:53 compute-1 ceph-mon[80107]: from='client.27926 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2655571253' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 26 10:29:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/4016153586' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 10:29:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/657819002' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 10:29:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2740923979' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 26 10:29:53 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/4066770674' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:53.433978+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:54.434341+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:55.434533+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:56.434847+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:57.434979+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:58.435336+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T09:59:59.435477+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:00.435868+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:01.436000+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:02.436137+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:03.436267+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:04.436390+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77889536 unmapped: 630784 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 58.666145325s of 58.674335480s, submitted: 2
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:05.436531+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77889536 unmapped: 630784 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:06.436766+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941181 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77889536 unmapped: 630784 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:07.436881+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77889536 unmapped: 630784 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:08.436991+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77889536 unmapped: 630784 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread fragmentation_score=0.000026 took=0.000118s
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:09.437161+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:10.437320+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:11.437455+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:12.437741+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:13.437938+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:14.438133+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:15.438387+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:16.438602+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:17.438785+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:18.438917+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:19.439082+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:20.439207+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:21.439327+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:22.439550+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:23.439721+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:24.439888+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:25.440022+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:26.440137+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:27.440291+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:28.440669+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:29.440800+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:30.440944+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:31.441068+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:32.441257+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:33.441383+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:34.441514+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:35.441657+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:36.441881+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:37.442038+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:38.442157+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:39.442286+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:40.442420+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:41.442537+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d171202800 session 0x55d17260c1e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:42.442671+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:43.442776+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:44.442909+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:45.443060+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:46.443271+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:47.443433+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:48.443591+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:49.443757+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:50.443901+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:51.444077+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:52.444261+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:53.444396+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:54.444528+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:55.444778+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:56.444993+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:57.445158+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d172163800 session 0x55d16feeed20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:58.445342+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:00:59.445572+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:00.445772+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:01.445927+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 56.543655396s of 56.563098907s, submitted: 2
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939999 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:02.446072+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:03.446231+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:04.446410+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:05.446545+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:06.446765+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939999 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:07.446944+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:08.447108+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:09.447257+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:10.447489+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:11.447617+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939999 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.324966431s of 10.332220078s, submitted: 1
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:12.447762+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:13.447885+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:14.448020+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:15.448211+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:16.448417+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:17.448570+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:18.448763+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:19.448888+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77979648 unmapped: 540672 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:20.449055+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77979648 unmapped: 540672 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:21.449222+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:22.449378+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:23.449548+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:24.449755+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:25.449918+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:26.450179+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:27.450335+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:28.450496+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:29.450680+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:30.450852+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:31.450994+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:32.451143+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:33.451290+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:34.451483+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:35.451593+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:36.451685+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77996032 unmapped: 524288 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:37.451812+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77996032 unmapped: 524288 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:38.451955+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77996032 unmapped: 524288 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:39.452067+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:40.452189+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:41.452343+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:42.452488+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:43.452609+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:44.452759+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:45.452962+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:46.453211+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:47.453365+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:48.453501+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 36.846134186s of 36.849975586s, submitted: 1
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78102528 unmapped: 417792 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:49.453613+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78290944 unmapped: 229376 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:50.453774+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78323712 unmapped: 196608 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:51.453896+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78323712 unmapped: 196608 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:52.454020+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78323712 unmapped: 196608 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:53.454190+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78323712 unmapped: 196608 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:54.454436+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78323712 unmapped: 196608 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:55.454591+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:56.454796+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:57.454950+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:58.455096+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:01:59.455282+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:00.455502+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:01.455691+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:02.455883+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:03.456069+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:04.456243+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:05.456384+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:06.456564+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:07.456732+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:08.456893+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:09.457061+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:10.457196+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:11.457364+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:12.457506+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:13.457629+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:14.457812+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:15.457955+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:16.458140+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:17.458291+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:18.458428+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:19.458572+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:20.458827+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:21.459155+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:22.459380+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:23.459576+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:24.459743+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78356480 unmapped: 163840 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:25.459883+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78356480 unmapped: 163840 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:26.460074+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78356480 unmapped: 163840 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:27.460550+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 37.952041626s of 38.464164734s, submitted: 139
Jan 26 10:29:53 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 26 10:29:53 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2923609229' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78381056 unmapped: 1187840 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:28.460685+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:29.461524+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:30.461675+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:31.462302+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:32.462511+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:33.462789+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:34.462905+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:35.463055+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:36.463224+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78422016 unmapped: 1146880 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:37.463378+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78422016 unmapped: 1146880 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:38.463536+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:39.463775+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:40.463934+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:41.464161+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:42.464294+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:43.464441+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:44.464591+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:45.464777+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:46.464945+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:47.465065+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:48.465193+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:49.465353+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:50.465479+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:51.465692+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:52.465864+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:53.466000+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:54.466156+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:55.466308+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:56.466458+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:57.466617+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:58.466760+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:02:59.466890+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:00.467039+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:01.467203+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:02.467349+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:03.467487+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:04.467636+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:05.467818+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:06.468033+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:07.468345+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:08.468593+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:09.468798+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:10.469047+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:11.469363+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:12.469580+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:13.469913+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:14.470096+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:15.470353+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:16.470543+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:17.470719+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:18.470906+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:19.471230+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:20.471407+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 52.869400024s of 53.368377686s, submitted: 86
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:21.471569+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:22.471798+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943023 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:23.472005+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:24.472174+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:25.472388+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:26.472623+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:27.472758+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943023 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:28.472904+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:29.473103+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:30.473275+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:31.473436+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:32.473578+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943023 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.398575783s of 12.402492523s, submitted: 1
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:33.473748+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:34.473919+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:35.474120+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17035d000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:36.474318+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17254c000 session 0x55d17293a5a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:37.474516+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943944 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:38.474668+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:39.474862+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:40.475044+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:41.475254+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:42.475397+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943944 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:43.475554+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:44.475750+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:45.475868+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:46.476042+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:47.476168+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943944 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:48.476328+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:49.476485+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:50.476645+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:51.476832+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:52.476967+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943944 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:53.477141+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.989221573s of 20.182910919s, submitted: 2
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:54.477280+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:55.477439+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:56.477604+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:57.477751+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 945456 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:58.477902+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:03:59.478039+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:00.478177+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:01.478306+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:02.478523+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:03.478684+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:04.478867+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:05.479002+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:06.479194+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:07.479367+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:08.479541+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:09.479792+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:10.479944+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:11.480086+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:12.480391+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:13.481667+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:14.481977+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:15.482757+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:16.483065+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:17.483563+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:18.483865+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:19.484041+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:20.484275+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:21.484534+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:22.484748+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:23.484938+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:24.485095+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:25.485242+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:26.485487+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:27.485947+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:28.486179+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:29.486507+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:30.486755+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:31.486991+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:32.487158+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:33.487298+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:34.487480+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:35.487747+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:36.487988+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:37.488262+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:38.488424+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:39.488578+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:40.488736+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:41.488886+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:42.489031+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:43.489175+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:44.489316+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:45.489884+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:46.490246+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:47.490882+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:48.491226+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:49.491418+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:50.491797+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:51.492298+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17035d000 session 0x55d16feee1e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:52.492546+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:53.492869+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:54.493138+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 1089536 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:55.493300+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 1089536 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:56.493465+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 1089536 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:57.493774+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 1089536 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:58.494008+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 1089536 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:04:59.494216+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:00.495160+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:01.495408+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:02.495658+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:03.495849+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:04.495998+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:05.496206+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:06.496523+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:07.496820+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:08.497071+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:09.497321+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:10.497603+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:11.497871+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:12.498044+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 handle_osd_map epochs [144,144], i have 143, src has [1,144]
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 79.554496765s of 79.564147949s, submitted: 3
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _renew_subs
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 143 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78536704 unmapped: 1032192 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:13.498180+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78536704 unmapped: 1032192 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:14.498373+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 79724544 unmapped: 16629760 heap: 96354304 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:15.498519+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 146 ms_handle_reset con 0x55d171202800 session 0x55d17263d4a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042ac00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80936960 unmapped: 23814144 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:16.498740+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 146 heartbeat osd_stat(store_statfs(0x4fb58a000/0x0/0x4ffc00000, data 0x15d8375/0x1691000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80936960 unmapped: 23814144 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:17.498898+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _renew_subs
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 147 ms_handle_reset con 0x55d17042ac00 session 0x55d172584d20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1101069 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:18.499051+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:19.499199+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:20.499397+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 147 heartbeat osd_stat(store_statfs(0x4fb585000/0x0/0x4ffc00000, data 0x15da4a0/0x1695000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 147 handle_osd_map epochs [148,148], i have 147, src has [1,148]
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 147 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:21.499536+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:22.499643+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:23.499779+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:24.499930+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:25.500065+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:26.500280+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:27.500436+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:28.500593+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:29.500764+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:30.500901+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:31.501038+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:32.501197+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:33.514913+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:34.515077+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:35.515243+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:36.515435+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:37.515576+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80961536 unmapped: 23789568 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:38.515740+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80961536 unmapped: 23789568 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:39.515910+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:40.516060+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:41.516202+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:42.516317+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:43.516472+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:44.516552+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:45.516684+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:46.516860+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:47.516981+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:48.517117+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:49.517256+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:50.517410+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:51.517553+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:52.517677+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:53.517808+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 40.111835480s of 40.450466156s, submitted: 51
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:54.518038+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:55.518206+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:56.518410+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:57.518563+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1104407 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:58.518783+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:05:59.518921+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:00.519085+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb584000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:01.519226+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80986112 unmapped: 23764992 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:02.519383+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80986112 unmapped: 23764992 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102976 data_alloc: 218103808 data_used: 176128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:03.519591+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80986112 unmapped: 23764992 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172151000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 148 ms_handle_reset con 0x55d172151000 session 0x55d172095e00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17035d000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 148 ms_handle_reset con 0x55d17035d000 session 0x55d172672960
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:04.519761+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 92413952 unmapped: 12337152 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042ac00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb584000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:05.519919+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 92504064 unmapped: 12247040 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.857666016s of 12.088842392s, submitted: 2
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:06.520212+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 92504064 unmapped: 12247040 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _renew_subs
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 149 handle_osd_map epochs [150,150], i have 149, src has [1,150]
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:07.520413+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 150 heartbeat osd_stat(store_statfs(0x4fb57c000/0x0/0x4ffc00000, data 0x15e06b1/0x169f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d17042ac00 session 0x55d17019b680
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d171202800 session 0x55d172671680
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d17254c000 session 0x55d172671860
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172153400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1167114 data_alloc: 234881024 data_used: 11649024
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:08.520595+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d172153400 session 0x55d172671a40
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17035d000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d17035d000 session 0x55d172671c20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:09.520772+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:10.520928+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 150 heartbeat osd_stat(store_statfs(0x4fb234000/0x0/0x4ffc00000, data 0x19286b1/0x19e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:11.521067+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:12.521176+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042ac00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d17042ac00 session 0x55d1726714a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 150 handle_osd_map epochs [150,151], i have 150, src has [1,151]
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170865 data_alloc: 234881024 data_used: 11649024
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:13.521281+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94191616 unmapped: 10559488 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a000 session 0x55d170188f00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:14.521385+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97239040 unmapped: 7512064 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fb231000/0x0/0x4ffc00000, data 0x192a683/0x19ea000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:15.521547+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:16.521725+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:17.521846+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192885 data_alloc: 234881024 data_used: 14888960
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:18.522016+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fb231000/0x0/0x4ffc00000, data 0x192a683/0x19ea000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:19.522143+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:20.522285+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:21.522446+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fb231000/0x0/0x4ffc00000, data 0x192a683/0x19ea000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:22.522555+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192885 data_alloc: 234881024 data_used: 14888960
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:23.522669+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:24.522825+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.199676514s of 19.449176788s, submitted: 39
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:25.522952+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102621184 unmapped: 4235264 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:26.523116+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102588416 unmapped: 4268032 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa917000/0x0/0x4ffc00000, data 0x2245683/0x2305000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:27.523232+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102457344 unmapped: 4399104 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1274279 data_alloc: 234881024 data_used: 15114240
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:28.523345+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102457344 unmapped: 4399104 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa876000/0x0/0x4ffc00000, data 0x22e6683/0x23a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:29.523464+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102457344 unmapped: 4399104 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:30.523562+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 4210688 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:31.523776+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 4210688 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:32.523915+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1272359 data_alloc: 234881024 data_used: 15114240
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:33.524044+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:34.524385+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa852000/0x0/0x4ffc00000, data 0x230a683/0x23ca000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:35.524538+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:36.524741+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:37.524894+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.935168266s of 12.713699341s, submitted: 84
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1272375 data_alloc: 234881024 data_used: 15114240
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:38.525059+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:39.525224+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:40.525379+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa847000/0x0/0x4ffc00000, data 0x2315683/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:41.525540+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:42.525694+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa847000/0x0/0x4ffc00000, data 0x2315683/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271784 data_alloc: 234881024 data_used: 15114240
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:43.525858+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:44.525994+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:45.526134+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102096896 unmapped: 4759552 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17250dc00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:46.526278+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102096896 unmapped: 4759552 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa844000/0x0/0x4ffc00000, data 0x2318683/0x23d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:47.526445+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d172672d20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172994000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172994000 session 0x55d172896780
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17035d000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17035d000 session 0x55d17019e000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102088704 unmapped: 4767744 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a000 session 0x55d1725c54a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042ac00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1273484 data_alloc: 234881024 data_used: 15114240
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:48.526586+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042ac00 session 0x55d1725c5680
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103366656 unmapped: 3489792 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa844000/0x0/0x4ffc00000, data 0x2318683/0x23d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:49.526736+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d17266eb40
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172096800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.664448738s of 11.864383698s, submitted: 7
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103407616 unmapped: 9748480 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d170d8cd20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17035d000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17035d000 session 0x55d1726721e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a000 session 0x55d17034ef00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042ac00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042ac00 session 0x55d1703c14a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d17289b680
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa844000/0x0/0x4ffc00000, data 0x2318683/0x23d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:50.526897+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103424000 unmapped: 9732096 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2b2000/0x0/0x4ffc00000, data 0x28aa683/0x296a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:51.527128+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103424000 unmapped: 9732096 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:52.527287+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103424000 unmapped: 9732096 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1318619 data_alloc: 234881024 data_used: 15642624
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:53.527411+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103424000 unmapped: 9732096 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:54.527603+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103424000 unmapped: 9732096 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2b2000/0x0/0x4ffc00000, data 0x28aa683/0x296a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172661400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172661400 session 0x55d16f9e6960
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:55.527751+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103055360 unmapped: 10100736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:56.527910+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103055360 unmapped: 10100736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17035d000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:57.528047+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 104669184 unmapped: 8486912 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1355120 data_alloc: 234881024 data_used: 20631552
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:58.528184+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17019ab40
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:06:59.528327+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:00.528500+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2b1000/0x0/0x4ffc00000, data 0x28aa683/0x296a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:01.528664+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:02.528794+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1360592 data_alloc: 234881024 data_used: 21483520
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:03.528918+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.041341782s of 14.129011154s, submitted: 19
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2b1000/0x0/0x4ffc00000, data 0x28aa683/0x296a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:04.529081+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108208128 unmapped: 4947968 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2af000/0x0/0x4ffc00000, data 0x28ad683/0x296d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:05.529162+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 4915200 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:06.529346+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108249088 unmapped: 4907008 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2af000/0x0/0x4ffc00000, data 0x28ad683/0x296d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:07.529462+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108249088 unmapped: 4907008 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1360332 data_alloc: 234881024 data_used: 21483520
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:08.529597+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109010944 unmapped: 4145152 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:09.529766+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 2949120 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:10.529966+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 112525312 unmapped: 5578752 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:11.530108+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113344512 unmapped: 4759552 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f83e5000/0x0/0x4ffc00000, data 0x31b9683/0x3279000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:12.530235+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113393664 unmapped: 4710400 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1447918 data_alloc: 234881024 data_used: 22425600
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:13.530385+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113393664 unmapped: 4710400 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:14.530537+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113393664 unmapped: 4710400 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:15.530681+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113582080 unmapped: 4521984 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042ac00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.826783180s of 12.373358727s, submitted: 98
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:16.530859+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f83e5000/0x0/0x4ffc00000, data 0x31b9683/0x3279000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:17.531005+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1442638 data_alloc: 234881024 data_used: 22425600
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:18.531196+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:19.531393+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:20.531565+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:21.531725+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f83ea000/0x0/0x4ffc00000, data 0x31c2683/0x3282000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:22.531931+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17035d000 session 0x55d17289a000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a000 session 0x55d1703c6d20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17250dc00 session 0x55d1730cab40
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:23.532068+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1370749 data_alloc: 234881024 data_used: 18501632
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042ac00 session 0x55d170d8c1e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d172e48000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:24.532256+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:25.532808+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:26.533501+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:27.533837+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f905a000/0x0/0x4ffc00000, data 0x2327683/0x23e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:28.534389+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1285633 data_alloc: 234881024 data_used: 15642624
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.467782021s of 12.590607643s, submitted: 41
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:29.534762+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9058000/0x0/0x4ffc00000, data 0x2327683/0x23e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:30.535279+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9058000/0x0/0x4ffc00000, data 0x2327683/0x23e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172670f00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c000 session 0x55d172ba1860
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:31.535434+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109109248 unmapped: 8994816 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:32.535974+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9283000/0x0/0x4ffc00000, data 0x2327683/0x23e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [0,1])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 11517952 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1728970e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:33.536094+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1166733 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:34.536286+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:35.536549+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:36.537000+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:37.537204+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:38.537502+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1166733 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:39.537694+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17035d000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.436046600s of 11.544039726s, submitted: 32
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:40.537963+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:41.538173+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:42.538356+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:43.538549+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1168245 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:44.538778+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:45.538934+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:46.539216+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:47.539406+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:48.539554+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1168245 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:49.539805+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:50.540038+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:51.540837+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:52.541669+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:53.541816+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1168245 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:54.542011+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:55.542165+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:56.542398+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:57.543171+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:58.553752+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1168245 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:07:59.554514+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:00.554665+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:01.554873+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a000 session 0x55d172ba0780
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172ba1a40
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172ba1680
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:02.555100+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d172ba14a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c000 session 0x55d172ba0d20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:03.555356+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17250dc00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17250dc00 session 0x55d1701881e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 23.435865402s of 23.439153671s, submitted: 1
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170045 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d170188f00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d170d8c960
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d170d8c000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c000 session 0x55d170d8c5a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172993000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172993000 session 0x55d16f9e6960
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:04.556133+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:05.556608+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d16f9e7a40
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:06.556966+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:07.557298+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d16f9e6000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:08.557625+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193785 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d16f9e65a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c000 session 0x55d172ee6b40
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:09.557761+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172096400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:10.557878+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106422272 unmapped: 13852672 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:11.558128+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:12.558533+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:13.558865+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212814 data_alloc: 234881024 data_used: 14893056
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:14.559184+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:15.559343+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:16.559505+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:17.559666+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:18.559819+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212814 data_alloc: 234881024 data_used: 14893056
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:19.560128+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 13254656 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:20.560271+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 13254656 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:21.560428+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 13254656 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:22.560635+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.067176819s of 19.197723389s, submitted: 13
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 10002432 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:23.560797+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271222 data_alloc: 234881024 data_used: 15110144
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109248512 unmapped: 11026432 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9553000/0x0/0x4ffc00000, data 0x2052693/0x2113000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:24.560963+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:25.561133+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:26.561309+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:27.561506+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:28.561771+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278384 data_alloc: 234881024 data_used: 14929920
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:29.561967+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:30.562142+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:31.562364+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:32.562607+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:33.562820+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278384 data_alloc: 234881024 data_used: 14929920
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:34.562958+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:35.563107+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:36.563272+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:37.563491+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:38.563668+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278384 data_alloc: 234881024 data_used: 14929920
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:39.563815+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:40.563969+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:41.564180+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:42.565832+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:43.565972+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278384 data_alloc: 234881024 data_used: 14929920
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:44.566118+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:45.566251+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:46.566380+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:47.566504+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:48.566633+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278384 data_alloc: 234881024 data_used: 14929920
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:49.566840+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:50.566989+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:51.567092+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109895680 unmapped: 10379264 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:52.567227+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096400 session 0x55d17263d2c0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 30.033960342s of 30.179452896s, submitted: 56
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d1730ca5a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109903872 unmapped: 10371072 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:53.567362+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172670000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1176888 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:54.567648+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:55.567794+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:56.567949+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:57.568097+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:58.568241+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1176888 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:08:59.568385+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:00.568577+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:01.568696+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:02.568887+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:03.569018+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1176888 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:04.569184+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:05.569316+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:06.569472+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:07.569590+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:08.569747+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1176888 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:09.569880+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:10.570004+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:11.570138+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:12.570270+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:13.570450+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1176888 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:14.570619+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:15.570753+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:16.570896+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:17.570991+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172effe00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d1726e9680
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c000 session 0x55d172b9e1e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172e81c20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 25.123188019s of 25.234869003s, submitted: 31
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 116154368 unmapped: 21454848 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172672780
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d172896f00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d170400d20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172997800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:18.571106+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172997800 session 0x55d170400000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1703e32c0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1292908 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:19.571243+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:20.571399+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:21.571576+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:22.571720+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:23.571853+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1292908 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:24.571975+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:25.572121+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d170047680
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:26.572274+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:27.572396+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 19595264 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:28.572623+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1402500 data_alloc: 251658240 data_used: 28434432
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 19595264 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:29.572749+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 19595264 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:30.572955+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 19595264 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:31.573139+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 19595264 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:32.573278+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 19562496 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:33.573418+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1402500 data_alloc: 251658240 data_used: 28434432
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 19562496 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:34.573629+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 19562496 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:35.573771+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 19562496 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:36.573929+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 19562496 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:37.574063+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 20.038774490s of 20.122617722s, submitted: 17
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 17965056 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:38.574179+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1445412 data_alloc: 251658240 data_used: 28798976
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122085376 unmapped: 15523840 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:39.574326+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122789888 unmapped: 14819328 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:40.574449+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8a3b000/0x0/0x4ffc00000, data 0x2b71683/0x2c31000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 14786560 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:41.574632+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 14786560 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:42.574785+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 14786560 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:43.574916+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1458516 data_alloc: 251658240 data_used: 29515776
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 14786560 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:44.575040+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 14786560 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:45.575154+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8a2b000/0x0/0x4ffc00000, data 0x2b81683/0x2c41000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122855424 unmapped: 14753792 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:46.575458+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122855424 unmapped: 14753792 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:47.575734+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122863616 unmapped: 14745600 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:48.575930+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1458516 data_alloc: 251658240 data_used: 29515776
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122863616 unmapped: 14745600 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:49.576080+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122863616 unmapped: 14745600 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:50.576200+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8a2b000/0x0/0x4ffc00000, data 0x2b81683/0x2c41000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:51.576402+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 8201 writes, 31K keys, 8201 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 8201 writes, 1912 syncs, 4.29 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1363 writes, 4432 keys, 1363 commit groups, 1.0 writes per commit group, ingest: 4.48 MB, 0.01 MB/s
                                           Interval WAL: 1363 writes, 582 syncs, 2.34 writes per sync, written: 0.00 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:52.576587+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:53.576744+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8a2b000/0x0/0x4ffc00000, data 0x2b81683/0x2c41000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1458516 data_alloc: 251658240 data_used: 29515776
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:54.576892+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:55.577028+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:56.577207+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172494400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d173144f00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c400 session 0x55d172037a40
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254d000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254d000 session 0x55d1730ca5a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254d000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254d000 session 0x55d1730caf00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.439634323s of 18.733009338s, submitted: 55
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1730cbc20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172896960
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172494400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d172ee7c20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c400 session 0x55d170188960
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 15179776 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:57.577342+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c400 session 0x55d172ee6b40
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:58.577498+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 15179776 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f87cb000/0x0/0x4ffc00000, data 0x2de06e5/0x2ea1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1476366 data_alloc: 251658240 data_used: 29515776
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:09:59.577687+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 15179776 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:00.577873+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 15179776 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f87cb000/0x0/0x4ffc00000, data 0x2de06e5/0x2ea1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:01.578037+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 15179776 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17259ab40
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:02.578184+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 121585664 unmapped: 16023552 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172494400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:03.578355+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 121585664 unmapped: 16023552 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f87ca000/0x0/0x4ffc00000, data 0x2de0708/0x2ea2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1481927 data_alloc: 251658240 data_used: 30072832
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:04.578466+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 121872384 unmapped: 15736832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:05.578628+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123502592 unmapped: 14106624 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:06.578855+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123502592 unmapped: 14106624 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:07.579012+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123502592 unmapped: 14106624 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:08.579158+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123502592 unmapped: 14106624 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1494239 data_alloc: 251658240 data_used: 31916032
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:09.579392+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123502592 unmapped: 14106624 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f87ca000/0x0/0x4ffc00000, data 0x2de0708/0x2ea2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:10.579579+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123510784 unmapped: 14098432 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.753458977s of 13.886870384s, submitted: 41
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:11.579768+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 14065664 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:12.579931+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 14065664 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:13.580067+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 14065664 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f87c8000/0x0/0x4ffc00000, data 0x2de1708/0x2ea3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1494831 data_alloc: 251658240 data_used: 31920128
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:14.580251+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 14065664 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:15.580424+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131710976 unmapped: 5898240 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:16.580654+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 133734400 unmapped: 3874816 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7899000/0x0/0x4ffc00000, data 0x3d11708/0x3dd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:17.580778+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132063232 unmapped: 5545984 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:18.580964+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132063232 unmapped: 5545984 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1622323 data_alloc: 251658240 data_used: 33873920
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7872000/0x0/0x4ffc00000, data 0x3d38708/0x3dfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:19.581127+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132063232 unmapped: 5545984 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7872000/0x0/0x4ffc00000, data 0x3d38708/0x3dfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:20.581263+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132063232 unmapped: 5545984 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:21.581465+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132063232 unmapped: 5545984 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.394704819s of 10.849593163s, submitted: 125
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:22.581655+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132079616 unmapped: 5529600 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172670000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d1720374a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254d000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f786f000/0x0/0x4ffc00000, data 0x3d3b708/0x3dfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:23.581814+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254d000 session 0x55d1703c65a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 7643136 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1463639 data_alloc: 251658240 data_used: 29515776
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f842a000/0x0/0x4ffc00000, data 0x2b826a6/0x2c43000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:24.581986+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 7643136 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:25.582168+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 7643136 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:26.582330+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 7643136 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:27.582465+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 7643136 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:28.582650+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 7634944 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1463639 data_alloc: 251658240 data_used: 29515776
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d170047c20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d17259b680
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:29.582806+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 7634944 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254d000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254d000 session 0x55d172671860
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f842a000/0x0/0x4ffc00000, data 0x2b82683/0x2c42000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:30.583027+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:31.583235+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:32.583449+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:33.583637+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197707 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:34.583814+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:35.583958+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:36.584141+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:37.584320+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:38.584430+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197707 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:39.584753+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:40.585023+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:41.585679+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:42.585935+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:43.586281+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197707 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:44.587342+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:45.587794+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:46.588369+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:47.588547+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:48.588797+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197707 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:49.589609+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:50.589863+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:51.590164+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:52.590533+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:53.590787+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197707 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:54.591017+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:55.591246+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:56.591598+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 34.995807648s of 35.219715118s, submitted: 71
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1721d2000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d170401c20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d170c97a40
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d1703c54a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d1703c4780
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:57.591823+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9e2d000/0x0/0x4ffc00000, data 0x177e6e5/0x183f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:58.592001+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1218652 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:10:59.592222+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:00.592428+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:01.592625+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9e2d000/0x0/0x4ffc00000, data 0x177e6e5/0x183f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:02.592767+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9e2d000/0x0/0x4ffc00000, data 0x177e6e5/0x183f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:03.593060+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254d000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254d000 session 0x55d1726e8960
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1218652 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:04.593219+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172494400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:05.593348+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:06.593573+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 22233088 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:07.593755+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:08.593946+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9e2d000/0x0/0x4ffc00000, data 0x177e6e5/0x183f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1222452 data_alloc: 234881024 data_used: 12705792
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:09.594116+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:10.594320+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:11.594546+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:12.594800+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:13.594957+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9e2d000/0x0/0x4ffc00000, data 0x177e6e5/0x183f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1222452 data_alloc: 234881024 data_used: 12705792
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:14.595141+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:15.595299+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:16.595511+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.421459198s of 19.482046127s, submitted: 29
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:17.595723+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117194752 unmapped: 20414464 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:18.595912+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119021568 unmapped: 18587648 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:19.596107+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120135680 unmapped: 17473536 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:20.596285+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120143872 unmapped: 17465344 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:21.596481+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120143872 unmapped: 17465344 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:22.596785+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120143872 unmapped: 17465344 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:23.596960+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120143872 unmapped: 17465344 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:24.597120+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:25.597303+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:26.597494+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:27.597653+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:28.597832+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:29.598029+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:30.598251+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:31.598496+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:32.598685+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:33.598921+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:34.599075+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:35.599228+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:36.599428+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:37.599616+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:38.599806+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:39.599969+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:40.600149+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:41.600347+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:42.600512+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:43.600768+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:44.600948+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:45.601239+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:46.601430+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:47.601636+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120168448 unmapped: 17440768 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:48.601789+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120168448 unmapped: 17440768 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 31.392076492s of 32.704330444s, submitted: 79
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:49.601914+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1262116 data_alloc: 234881024 data_used: 13058048
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 18792448 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:50.602038+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118939648 unmapped: 18669568 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:51.602202+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118939648 unmapped: 18669568 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f98a7000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:52.602371+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118939648 unmapped: 18669568 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:53.602573+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118939648 unmapped: 18669568 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:54.602802+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1262116 data_alloc: 234881024 data_used: 13058048
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f98a7000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118939648 unmapped: 18669568 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:55.603028+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118947840 unmapped: 18661376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:56.603219+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118947840 unmapped: 18661376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1721fde00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:57.603402+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1721fc3c0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1721fd680
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d17266f680
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d17266fa40
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d17266e960
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d17266f0e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119488512 unmapped: 25468928 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17266e000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d17266e780
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f98a7000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:58.603535+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119488512 unmapped: 25468928 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:11:59.603754+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1302588 data_alloc: 234881024 data_used: 13058048
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.978124619s of 10.595539093s, submitted: 169
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f5000/0x0/0x4ffc00000, data 0x21b66e5/0x2277000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 25534464 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:00.603935+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 25534464 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:01.604117+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 25534464 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f5000/0x0/0x4ffc00000, data 0x21b66e5/0x2277000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:02.604245+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 25534464 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:03.604386+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f5000/0x0/0x4ffc00000, data 0x21b66e5/0x2277000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 25534464 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d17266fc20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:04.604582+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1304100 data_alloc: 234881024 data_used: 13041664
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d17266f4a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119439360 unmapped: 25518080 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:05.604765+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119439360 unmapped: 25518080 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:06.605006+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f5000/0x0/0x4ffc00000, data 0x21b66e5/0x2277000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119447552 unmapped: 25509888 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:07.605153+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d17266e1e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119447552 unmapped: 25509888 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d173145e00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:08.605800+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172169800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 25501696 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:09.605976+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1303553 data_alloc: 234881024 data_used: 13041664
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 25501696 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f4000/0x0/0x4ffc00000, data 0x21b6708/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:10.606170+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 25501696 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:11.606356+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 25501696 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:12.606485+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 25501696 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f4000/0x0/0x4ffc00000, data 0x21b6708/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:13.606674+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:14.606811+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1314801 data_alloc: 234881024 data_used: 14536704
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f4000/0x0/0x4ffc00000, data 0x21b6708/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:15.607006+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d172659680
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:16.607196+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:17.607355+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:18.607502+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:19.607668+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1335625 data_alloc: 234881024 data_used: 17674240
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f4000/0x0/0x4ffc00000, data 0x21b6708/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f4000/0x0/0x4ffc00000, data 0x21b6708/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:20.607822+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:21.607974+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:22.608155+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:23.608331+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:24.608488+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1335625 data_alloc: 234881024 data_used: 17674240
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 25.355880737s of 25.392654419s, submitted: 18
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:25.608683+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8a2c000/0x0/0x4ffc00000, data 0x2b76708/0x2c38000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 22511616 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:26.608933+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172162400 session 0x55d1730ca1e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172166800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 22511616 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:27.609121+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122363904 unmapped: 22593536 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:28.609344+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122363904 unmapped: 22593536 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:29.609545+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1409087 data_alloc: 234881024 data_used: 17907712
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f899e000/0x0/0x4ffc00000, data 0x2c0c708/0x2cce000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,0,0,1,1])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122363904 unmapped: 22593536 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:30.609762+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122634240 unmapped: 22323200 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:31.609939+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122855424 unmapped: 22102016 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:32.610073+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122871808 unmapped: 22085632 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:33.610280+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122757120 unmapped: 22200320 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f891a000/0x0/0x4ffc00000, data 0x2c90708/0x2d52000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:34.610588+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1419099 data_alloc: 234881024 data_used: 17899520
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 22134784 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 4.809408665s of 10.088632584s, submitted: 133
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:35.610761+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122904576 unmapped: 22052864 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:36.611029+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122912768 unmapped: 22044672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:37.611179+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f891a000/0x0/0x4ffc00000, data 0x2c90708/0x2d52000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 22036480 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:38.611293+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f88f9000/0x0/0x4ffc00000, data 0x2cb1708/0x2d73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 22036480 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:39.611572+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1420491 data_alloc: 234881024 data_used: 17899520
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122929152 unmapped: 22028288 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:40.611891+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d1721d2000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d1703c61e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122937344 unmapped: 22020096 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:41.612133+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17259b680
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:42.612298+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:43.612486+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:44.612636+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1273109 data_alloc: 234881024 data_used: 13041664
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f98a6000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:45.612766+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:46.612962+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:47.615603+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f98a6000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:48.619106+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.456459999s of 13.937705994s, submitted: 67
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d1703c6960
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c400 session 0x55d1725841e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:49.620178+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1272745 data_alloc: 234881024 data_used: 13041664
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d1726701e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:50.621207+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:51.621365+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:52.621727+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:53.623234+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:54.623448+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:55.623988+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:56.624524+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:57.624778+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:58.624957+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:12:59.625352+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:00.625649+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:01.625836+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:02.625964+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:03.626111+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:04.626302+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:05.626681+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:06.626932+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:07.627192+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:08.627437+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:09.627676+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:10.627920+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:11.628183+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:12.628445+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:13.628603+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:14.628851+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:15.629067+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:16.629335+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:17.629609+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:18.629848+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:19.630098+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:20.631352+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:21.632359+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:22.633282+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:23.634081+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:24.634875+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:25.635551+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:26.636091+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:27.636384+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:28.636771+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:29.637380+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:30.638063+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:31.638614+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:32.639199+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:33.639611+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:34.639792+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:35.639898+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 46.250705719s of 46.323696136s, submitted: 22
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d1703c4f00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d170d8dc20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d17263c780
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d170d8d2c0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172494400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d172895860
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:36.640439+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9eff000/0x0/0x4ffc00000, data 0x16ad683/0x176d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:37.640820+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9eff000/0x0/0x4ffc00000, data 0x16ad683/0x176d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:38.641202+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:39.641536+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224861 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:40.641667+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9eff000/0x0/0x4ffc00000, data 0x16ad683/0x176d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17254c400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c400 session 0x55d17263d680
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1730cbc20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:41.641773+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119824384 unmapped: 25133056 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d170046780
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:42.641930+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d1721fc780
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118849536 unmapped: 26107904 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172494400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:43.642202+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118734848 unmapped: 26222592 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:44.642358+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 26214400 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1229754 data_alloc: 234881024 data_used: 12795904
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:45.642588+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 26214400 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:46.642954+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9eff000/0x0/0x4ffc00000, data 0x16ad683/0x176d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 26214400 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d1720954a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.380515099s of 11.418202400s, submitted: 13
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d17263c960
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:47.643590+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17266fe00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118751232 unmapped: 26206208 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:48.643953+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118751232 unmapped: 26206208 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: mgrc ms_handle_reset ms_handle_reset con 0x55d171202c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/2891176105
Jan 26 10:29:53 compute-1 ceph-osd[77632]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/2891176105,v1:192.168.122.100:6801/2891176105]
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: get_auth_request con 0x55d171202800 auth_method 0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: mgrc handle_mgr_configure stats_period=5
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:49.644151+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216030 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172167c00 session 0x55d1725c45a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17116ac00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:50.644309+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164400 session 0x55d172eff4a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172507800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:51.644480+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:52.644748+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:53.644977+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:54.645137+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216030 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:55.645601+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:56.645965+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:57.646115+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:58.646375+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:13:59.646643+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216030 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:00.646917+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:01.647067+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.299061775s of 15.430803299s, submitted: 25
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:02.647207+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164400 session 0x55d172ee72c0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d1721fc780
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d1703c5e00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172494400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d1728943c0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172895860
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:03.647436+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:04.647666+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:05.648028+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:06.648221+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f955c000/0x0/0x4ffc00000, data 0x2050683/0x2110000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:07.648416+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f955c000/0x0/0x4ffc00000, data 0x2050683/0x2110000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17042a800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d1703c70e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d17259af00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:08.648613+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118349824 unmapped: 36126720 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172164400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164400 session 0x55d17259a000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:09.648804+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1726701e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f955a000/0x0/0x4ffc00000, data 0x20506b6/0x2112000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117383168 unmapped: 37093376 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301098 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:10.649036+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117170176 unmapped: 37306368 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:11.649199+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123084800 unmapped: 31391744 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1703c14a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17293a780
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:12.649409+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172993800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.107300758s of 10.256335258s, submitted: 30
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117915648 unmapped: 36560896 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172993800 session 0x55d16f7d4000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:13.649553+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:14.649749+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:15.649910+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:16.650116+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:17.650251+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:18.650425+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:19.650601+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:20.650776+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:21.650929+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:22.651100+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:23.651295+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:24.651461+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 36536320 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:25.651625+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 36536320 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:26.652009+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 36536320 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:27.652664+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 36536320 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:28.653162+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 36536320 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:29.653750+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:30.653899+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:31.654330+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:32.654969+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:33.655416+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:34.656020+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:35.656406+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:36.656842+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:37.657160+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:38.657343+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:39.657591+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:40.657968+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:41.658253+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:42.658661+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172528000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528000 session 0x55d17266fe00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172dc3c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d173145c20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172dc3c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d173144960
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172659680
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:43.659053+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 30.658174515s of 30.887229919s, submitted: 38
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1731441e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172528000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528000 session 0x55d172095c20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172993800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172993800 session 0x55d172897680
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172993800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172993800 session 0x55d170c97a40
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172894d20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 38584320 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:44.659287+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 38584320 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1293569 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:45.659516+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 38584320 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:46.659864+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d172894b40
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 38584320 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172528000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528000 session 0x55d172896960
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:47.660063+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172dc3c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d172095860
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172dc3c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117555200 unmapped: 38543360 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d172ee70e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:48.660341+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f962c000/0x0/0x4ffc00000, data 0x1f80683/0x2040000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117858304 unmapped: 38240256 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:49.660784+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117858304 unmapped: 38240256 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1320800 data_alloc: 234881024 data_used: 15335424
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:50.660990+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:51.661223+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x1fa46b6/0x2066000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:52.661455+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x1fa46b6/0x2066000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:53.661891+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x1fa46b6/0x2066000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:54.662209+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x1fa46b6/0x2066000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1359864 data_alloc: 234881024 data_used: 21143552
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:55.662435+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:56.662682+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120209408 unmapped: 35889152 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:57.662910+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120209408 unmapped: 35889152 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:58.663116+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120209408 unmapped: 35889152 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:14:59.663334+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x1fa46b6/0x2066000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120209408 unmapped: 35889152 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1359864 data_alloc: 234881024 data_used: 21143552
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:00.663485+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120209408 unmapped: 35889152 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.557689667s of 17.637289047s, submitted: 14
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:01.663644+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 32169984 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:02.663838+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124157952 unmapped: 31940608 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:03.664058+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f919b000/0x0/0x4ffc00000, data 0x240f6b6/0x24d1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124321792 unmapped: 31776768 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:04.664245+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124329984 unmapped: 31768576 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1417424 data_alloc: 234881024 data_used: 21848064
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:05.664371+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:06.664617+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:07.664895+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9169000/0x0/0x4ffc00000, data 0x24406b6/0x2502000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:08.665131+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:09.665327+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1420008 data_alloc: 234881024 data_used: 22011904
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:10.665454+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:11.665639+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:12.665864+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172036f00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d172585e00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:13.666041+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9167000/0x0/0x4ffc00000, data 0x24436b6/0x2505000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172097800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.894859314s of 12.680984497s, submitted: 52
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:14.666285+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d172e80b40
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1237102 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:15.666479+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:16.666871+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:17.667124+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:18.667356+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:19.667556+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1237102 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:20.667769+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:21.668358+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:22.668597+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:23.668780+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:24.669038+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1237102 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:25.669212+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:26.669505+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:27.669745+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:28.669905+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:29.670677+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:30.670957+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1237102 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:31.672897+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:32.675825+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:33.676057+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:34.678385+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 38641664 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:35.678590+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1237102 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 38641664 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:36.679036+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 38641664 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:37.679292+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172ae9000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 23.971050262s of 24.083293915s, submitted: 28
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172ae9000 session 0x55d17311c1e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172897a40
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172097800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d172095c20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1726e83c0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172dc3c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d172efed20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 46235648 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:38.679448+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 46235648 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:39.679684+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 46227456 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:40.679933+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1353511 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9045000/0x0/0x4ffc00000, data 0x2567683/0x2627000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 46227456 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:41.680103+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 46227456 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:42.680284+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9045000/0x0/0x4ffc00000, data 0x2567683/0x2627000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 46219264 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:43.680444+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172163400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172163400 session 0x55d173144960
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172163400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172163400 session 0x55d1725845a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 46219264 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:44.680652+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117760000 unmapped: 46211072 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:45.680812+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1353511 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9045000/0x0/0x4ffc00000, data 0x2567683/0x2627000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1721d3a40
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172097800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d172e81e00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118374400 unmapped: 45596672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:46.681274+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172dc3c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118382592 unmapped: 45588480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:47.681731+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123191296 unmapped: 40779776 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:48.682139+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128458752 unmapped: 35512320 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:49.682586+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9020000/0x0/0x4ffc00000, data 0x258b693/0x264c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128491520 unmapped: 35479552 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:50.682895+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1468282 data_alloc: 251658240 data_used: 28352512
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128491520 unmapped: 35479552 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:51.683332+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128499712 unmapped: 35471360 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:52.683817+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128499712 unmapped: 35471360 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:53.684212+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9020000/0x0/0x4ffc00000, data 0x258b693/0x264c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128532480 unmapped: 35438592 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:54.684494+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128532480 unmapped: 35438592 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:55.684647+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1468282 data_alloc: 251658240 data_used: 28352512
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128532480 unmapped: 35438592 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:56.684886+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9020000/0x0/0x4ffc00000, data 0x258b693/0x264c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128532480 unmapped: 35438592 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:57.685281+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128548864 unmapped: 35422208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:58.685655+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172ae8c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172ae8c00 session 0x55d170d8d0e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172096800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d170d8dc20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 20.590911865s of 20.736030579s, submitted: 31
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d170401e00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172096800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d172673680
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172097800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d172672f00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172528c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528c00 session 0x55d1726730e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172c29800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172c29800 session 0x55d17034e1e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172672d20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172096800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d16f9e63c0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137723904 unmapped: 26247168 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:15:59.685904+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 25911296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:00.686058+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1635802 data_alloc: 251658240 data_used: 29327360
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 25911296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:01.686269+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e20000/0x0/0x4ffc00000, data 0x3781705/0x3844000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 25911296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:02.686479+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 25911296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:03.686603+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172097800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 138067968 unmapped: 25903104 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:04.686762+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d17259b680
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172528c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d17118a400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 136216576 unmapped: 27754496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:05.686940+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1632727 data_alloc: 251658240 data_used: 29339648
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137707520 unmapped: 26263552 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:06.687112+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e04000/0x0/0x4ffc00000, data 0x37a5705/0x3868000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:07.687233+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:08.687416+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:09.687551+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e04000/0x0/0x4ffc00000, data 0x37a5705/0x3868000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e04000/0x0/0x4ffc00000, data 0x37a5705/0x3868000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:10.687714+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1686383 data_alloc: 251658240 data_used: 34185216
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:11.687834+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e04000/0x0/0x4ffc00000, data 0x37a5705/0x3868000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:12.688042+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:13.688208+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139853824 unmapped: 24117248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:14.688393+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139853824 unmapped: 24117248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:15.688533+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e04000/0x0/0x4ffc00000, data 0x37a5705/0x3868000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1686383 data_alloc: 251658240 data_used: 34185216
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.407573700s of 17.746696472s, submitted: 131
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 140615680 unmapped: 23355392 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:16.688724+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143704064 unmapped: 20267008 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:17.688908+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f737e000/0x0/0x4ffc00000, data 0x4223705/0x42e6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143548416 unmapped: 20422656 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:18.689058+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143171584 unmapped: 20799488 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:19.689208+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7304000/0x0/0x4ffc00000, data 0x42a5705/0x4368000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143171584 unmapped: 20799488 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:20.689343+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7304000/0x0/0x4ffc00000, data 0x42a5705/0x4368000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1784443 data_alloc: 251658240 data_used: 35086336
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:21.689451+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143171584 unmapped: 20799488 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:22.689606+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143179776 unmapped: 20791296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:23.689781+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143179776 unmapped: 20791296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7304000/0x0/0x4ffc00000, data 0x42a5705/0x4368000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:24.690009+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143212544 unmapped: 20758528 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:25.690187+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143212544 unmapped: 20758528 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1785315 data_alloc: 251658240 data_used: 35086336
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:26.690368+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f72e3000/0x0/0x4ffc00000, data 0x42c6705/0x4389000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143245312 unmapped: 20725760 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:27.690565+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143245312 unmapped: 20725760 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:28.690732+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143269888 unmapped: 20701184 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f72e3000/0x0/0x4ffc00000, data 0x42c6705/0x4389000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:29.690882+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143269888 unmapped: 20701184 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:30.691054+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143269888 unmapped: 20701184 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1786531 data_alloc: 251658240 data_used: 35164160
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.249229431s of 14.751939774s, submitted: 97
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:31.691242+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143269888 unmapped: 20701184 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f72e3000/0x0/0x4ffc00000, data 0x42c6705/0x4389000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:32.691368+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 142999552 unmapped: 20971520 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:33.691572+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143065088 unmapped: 20905984 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:34.692287+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143065088 unmapped: 20905984 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528c00 session 0x55d17311c780
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17118a400 session 0x55d16f7d4780
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:35.692979+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143065088 unmapped: 20905984 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1786215 data_alloc: 251658240 data_used: 35164160
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1704003c0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:36.693410+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137355264 unmapped: 26615808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:37.693922+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137355264 unmapped: 26615808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7f93000/0x0/0x4ffc00000, data 0x2fb8693/0x3079000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:38.694396+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137355264 unmapped: 26615808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:39.694558+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137371648 unmapped: 26599424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:40.695067+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137371648 unmapped: 26599424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1579255 data_alloc: 234881024 data_used: 26243072
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:41.695482+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137371648 unmapped: 26599424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.313569069s of 11.179004669s, submitted: 65
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d172ee61e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d17311c000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:42.703118+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137371648 unmapped: 26599424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172096800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:43.703302+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f81e3000/0x0/0x4ffc00000, data 0x2fb8693/0x3079000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137388032 unmapped: 26583040 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:44.703665+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125648896 unmapped: 38322176 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:45.707252+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1275770 data_alloc: 234881024 data_used: 12288000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:46.707571+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [0,0,1])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d172eff4a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:47.708069+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:48.708361+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:49.708651+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:50.708989+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1269126 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:51.709332+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:52.709669+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:53.709958+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:54.710117+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:55.710367+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1269126 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:56.710675+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:57.710922+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:58.711183+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:16:59.711347+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:00.711606+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1269126 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:01.711844+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:02.712089+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:03.712292+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:04.712591+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:05.712812+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1269126 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:06.713001+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:07.713221+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:08.713417+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:09.713573+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172097800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d172efe3c0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1726725a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172096800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d1726590e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d172659a40
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172dc3c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:10.713790+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 25.249382019s of 28.619909286s, submitted: 55
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1371866 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d172e80960
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172528c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528c00 session 0x55d17311c3c0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172528c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528c00 session 0x55d172585c20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17259af00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:11.714001+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172096800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d172894000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:12.714198+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9067000/0x0/0x4ffc00000, data 0x2135683/0x21f5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:13.714413+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:14.902982+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:15.903133+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1352162 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:16.903309+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9067000/0x0/0x4ffc00000, data 0x2135683/0x21f5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d172673e00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:17.903474+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125812736 unmapped: 38158336 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172dc3c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d171202000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:18.903746+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125812736 unmapped: 38158336 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:19.903894+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:20.904081+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1432927 data_alloc: 234881024 data_used: 24039424
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:21.904233+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9066000/0x0/0x4ffc00000, data 0x21356a6/0x21f6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:22.904443+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:23.904884+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9066000/0x0/0x4ffc00000, data 0x21356a6/0x21f6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:24.905026+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:25.905197+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9066000/0x0/0x4ffc00000, data 0x21356a6/0x21f6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1432927 data_alloc: 234881024 data_used: 24039424
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:26.905440+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:27.905595+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:28.905735+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9066000/0x0/0x4ffc00000, data 0x21356a6/0x21f6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.371334076s of 18.521051407s, submitted: 16
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:29.905878+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 130768896 unmapped: 33202176 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f50000/0x0/0x4ffc00000, data 0x22456a6/0x2306000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:30.906057+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 130768896 unmapped: 33202176 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1444671 data_alloc: 234881024 data_used: 24260608
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:31.906229+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:32.906431+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f2d000/0x0/0x4ffc00000, data 0x22666a6/0x2327000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:33.906577+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f2d000/0x0/0x4ffc00000, data 0x22666a6/0x2327000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:34.906725+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f2d000/0x0/0x4ffc00000, data 0x22666a6/0x2327000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:35.906917+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1457483 data_alloc: 234881024 data_used: 24080384
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:36.907099+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:37.907240+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:38.907395+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f2d000/0x0/0x4ffc00000, data 0x22666a6/0x2327000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:39.907572+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:40.907793+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131129344 unmapped: 32841728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1457483 data_alloc: 234881024 data_used: 24080384
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:41.907957+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131129344 unmapped: 32841728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:42.908111+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131129344 unmapped: 32841728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d172e810e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202000 session 0x55d1726701e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:43.908288+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f2d000/0x0/0x4ffc00000, data 0x22666a6/0x2327000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131129344 unmapped: 32841728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d16f7c1400
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:44.908483+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131137536 unmapped: 32833536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.647443771s of 15.735019684s, submitted: 39
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:45.908751+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124911616 unmapped: 39059456 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280707 data_alloc: 234881024 data_used: 12181504
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e26a6/0x16a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:46.908918+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17311cd20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:47.909091+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:48.909300+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:49.909451+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:50.909627+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:51.909846+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:52.910045+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:53.910218+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:54.910447+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:55.910671+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:56.911018+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:57.911211+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:58.911374+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:17:59.911783+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:00.911962+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:01.912104+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:02.912266+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:03.912465+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:04.912669+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:05.912782+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:06.913062+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:07.913241+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:08.913419+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:09.913576+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:10.913767+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:11.913944+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:12.914105+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:13.914242+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:14.914397+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:15.914541+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:16.914790+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:17.915022+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:18.915201+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:19.915377+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:20.915561+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:21.915743+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:22.915903+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:23.916031+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:24.916209+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:25.916356+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:26.916527+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:27.916630+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:28.916761+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:29.916888+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:30.917055+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:31.917190+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:32.917333+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:33.917517+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:34.917666+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:35.917780+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:36.917991+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:37.918162+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:38.918373+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:39.918551+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:40.918779+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:41.918972+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:42.919238+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:43.919817+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:44.920515+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:45.920823+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:46.922768+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:47.923927+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:48.924318+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:49.924834+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:50.925489+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:51.925857+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:52.926321+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:53.926868+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:54.927001+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:55.927169+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:56.927415+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:57.927591+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:58.927756+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:18:59.927914+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:00.928074+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:01.928191+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:02.928359+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:03.928513+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:04.928649+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:05.928778+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:06.928925+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'config diff' '{prefix=config diff}'
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'config show' '{prefix=config show}'
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125296640 unmapped: 38674432 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'counter dump' '{prefix=counter dump}'
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'counter schema' '{prefix=counter schema}'
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:07.929126+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124674048 unmapped: 39297024 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:08.929293+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'log dump' '{prefix=log dump}'
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:09.929428+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'perf dump' '{prefix=perf dump}'
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124837888 unmapped: 39133184 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'perf schema' '{prefix=perf schema}'
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:10.929542+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124436480 unmapped: 39534592 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:11.929664+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124436480 unmapped: 39534592 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:12.929826+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124444672 unmapped: 39526400 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:13.929997+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124444672 unmapped: 39526400 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:14.930203+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124444672 unmapped: 39526400 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:15.930313+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124444672 unmapped: 39526400 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:16.930472+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124444672 unmapped: 39526400 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:17.930606+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124444672 unmapped: 39526400 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:18.930753+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124444672 unmapped: 39526400 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:19.930871+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124444672 unmapped: 39526400 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:20.930988+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124444672 unmapped: 39526400 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:21.931112+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124452864 unmapped: 39518208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:22.931236+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124452864 unmapped: 39518208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:23.931366+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124452864 unmapped: 39518208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:24.931502+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124452864 unmapped: 39518208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:25.931620+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124452864 unmapped: 39518208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:26.931756+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124452864 unmapped: 39518208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:27.931874+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124452864 unmapped: 39518208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:28.932015+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124452864 unmapped: 39518208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:29.932144+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124452864 unmapped: 39518208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:30.932271+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124452864 unmapped: 39518208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:31.932410+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124461056 unmapped: 39510016 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:32.932547+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124461056 unmapped: 39510016 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:33.932729+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124461056 unmapped: 39510016 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:34.932854+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124469248 unmapped: 39501824 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:35.932984+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124469248 unmapped: 39501824 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:36.933131+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124469248 unmapped: 39501824 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:37.933288+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124469248 unmapped: 39501824 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:38.933436+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124469248 unmapped: 39501824 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:39.933646+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124469248 unmapped: 39501824 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:40.933787+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124469248 unmapped: 39501824 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:41.934014+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124469248 unmapped: 39501824 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:42.934216+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 39493632 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:43.934433+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 39493632 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:44.934594+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 39493632 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:45.934824+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 39493632 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:46.935004+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 39493632 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:47.935191+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 39493632 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:48.935379+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 39493632 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:49.935626+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:50.935879+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.1 total, 600.0 interval
                                           Cumulative writes: 10K writes, 40K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
                                           Cumulative WAL: 10K writes, 2923 syncs, 3.64 writes per sync, written: 0.03 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2426 writes, 8573 keys, 2426 commit groups, 1.0 writes per commit group, ingest: 9.03 MB, 0.02 MB/s
                                           Interval WAL: 2426 writes, 1011 syncs, 2.40 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:51.936074+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:52.936212+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:53.936418+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:54.936673+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:55.936912+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:56.937112+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:57.937366+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:58.937554+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:19:59.937770+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:00.937969+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:01.938156+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124493824 unmapped: 39477248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:02.938308+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124493824 unmapped: 39477248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:03.938556+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124493824 unmapped: 39477248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:04.938760+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124493824 unmapped: 39477248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:05.938966+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124493824 unmapped: 39477248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:06.939136+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124493824 unmapped: 39477248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:07.939356+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124493824 unmapped: 39477248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:08.939546+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124493824 unmapped: 39477248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:09.939732+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124493824 unmapped: 39477248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:10.939946+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124493824 unmapped: 39477248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:11.940158+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124502016 unmapped: 39469056 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:12.940358+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124502016 unmapped: 39469056 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:13.940544+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124502016 unmapped: 39469056 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:14.940729+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124502016 unmapped: 39469056 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:15.940907+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124502016 unmapped: 39469056 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:16.941122+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124502016 unmapped: 39469056 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:17.941335+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124502016 unmapped: 39469056 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:18.941624+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124502016 unmapped: 39469056 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:19.941941+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:20.942155+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:21.942369+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:22.942578+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:23.942784+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:24.944111+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:25.944362+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:26.944566+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:27.944767+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:28.944894+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:29.945111+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:30.945372+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:31.945511+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:32.945634+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:33.945805+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:34.945960+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:35.946084+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:36.946262+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:37.946398+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:38.946517+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:39.946662+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:40.946872+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:41.947059+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:42.947234+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:43.947393+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:44.947521+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:45.947683+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:46.947910+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:47.948057+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:48.948233+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:49.948377+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:50.948525+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:51.948786+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:52.948944+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:53.949068+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:54.949205+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:55.949369+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:56.949560+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:57.949747+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:58.949871+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:20:59.950043+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:00.950166+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:01.950292+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:02.950391+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:03.950510+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:04.950634+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:05.950774+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:06.950941+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:07.951131+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:08.951263+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:09.951436+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:10.951609+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:11.951744+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:12.951869+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124542976 unmapped: 39428096 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:13.951983+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124542976 unmapped: 39428096 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:14.952189+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124542976 unmapped: 39428096 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:15.952331+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124542976 unmapped: 39428096 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:16.952506+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124551168 unmapped: 39419904 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:17.952654+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124551168 unmapped: 39419904 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:18.952800+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124551168 unmapped: 39419904 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:19.952926+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124551168 unmapped: 39419904 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:20.953074+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124551168 unmapped: 39419904 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:21.953273+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124551168 unmapped: 39419904 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:22.953433+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124559360 unmapped: 39411712 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:23.953870+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124559360 unmapped: 39411712 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:24.954019+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124559360 unmapped: 39411712 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:25.954175+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124559360 unmapped: 39411712 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:26.954366+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124559360 unmapped: 39411712 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:27.954581+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124559360 unmapped: 39411712 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:28.954839+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124559360 unmapped: 39411712 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:29.955033+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124559360 unmapped: 39411712 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:30.955199+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124559360 unmapped: 39411712 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:31.955351+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124559360 unmapped: 39411712 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:32.955513+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:33.955655+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:34.955838+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:35.956038+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:36.956239+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:37.956438+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:38.956623+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:39.956766+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:40.956936+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:41.957111+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:42.957255+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:43.957412+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:44.957607+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124583936 unmapped: 39387136 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:45.957800+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124583936 unmapped: 39387136 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:46.957968+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124583936 unmapped: 39387136 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:47.958139+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124583936 unmapped: 39387136 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:48.958511+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124583936 unmapped: 39387136 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 243.777282715s of 244.093200684s, submitted: 27
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:49.958654+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124600320 unmapped: 39370752 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:50.958848+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124608512 unmapped: 39362560 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:51.959011+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124731392 unmapped: 39239680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:52.959280+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:53.959460+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:54.959604+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:55.959755+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:56.959906+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:57.959993+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:58.960088+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:21:59.960172+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:00.960266+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:01.960392+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:02.960511+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:03.960656+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124755968 unmapped: 39215104 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:04.960848+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124755968 unmapped: 39215104 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:05.960978+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124755968 unmapped: 39215104 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:06.961204+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124755968 unmapped: 39215104 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:07.961376+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124755968 unmapped: 39215104 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:08.961537+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124755968 unmapped: 39215104 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:09.961661+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124755968 unmapped: 39215104 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:10.961796+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124755968 unmapped: 39215104 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:11.961930+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:12.962051+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:13.962190+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:14.962370+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:15.962544+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:16.962731+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:17.962906+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:18.963034+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:19.963160+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:20.963344+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:21.963515+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:22.963680+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:23.963836+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:24.963992+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:25.964143+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 36.141582489s of 36.681442261s, submitted: 145
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124780544 unmapped: 39190528 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:26.964334+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124796928 unmapped: 39174144 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:27.964494+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124911616 unmapped: 39059456 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:28.964890+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:29.965284+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:30.966566+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:31.967470+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:32.967895+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:33.969098+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:34.969578+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:35.970004+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:36.970725+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:37.970921+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:38.971475+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:39.971829+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17035d000 session 0x55d170400960
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172096800
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:40.971958+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:41.972233+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:42.972523+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:43.972910+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:44.973261+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:45.973424+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:46.973817+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:47.974096+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:48.974244+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:49.974482+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:50.974782+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:51.975017+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:52.975204+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:53.975400+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:54.975624+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:55.975805+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:56.975986+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:57.976241+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:58.976444+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:22:59.976692+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:00.976924+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:01.977190+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:02.977415+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:03.977800+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:04.977999+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:05.978182+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:06.978363+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:07.978600+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:08.978765+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:09.979088+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:10.979410+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:11.979688+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:12.979887+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:13.980109+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:14.980283+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:15.980464+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:16.980664+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:17.980788+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:18.980976+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:19.981131+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:20.981288+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:21.981446+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:22.981597+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:23.981777+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:24.982051+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:25.982262+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:26.982479+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:27.982626+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:28.982753+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:29.982876+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:30.983013+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:31.983149+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:32.983285+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:33.983452+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:34.983628+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:35.983821+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:36.983955+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:37.984123+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:38.984267+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:39.984417+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:40.984537+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:41.984769+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:42.984924+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:43.985172+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:44.985351+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:45.985459+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:46.985668+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:47.985889+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:48.986024+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:49.986214+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:50.986329+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:51.986449+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:52.986597+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:53.986799+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:54.986953+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:55.987133+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:56.987326+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:57.987515+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:58.987682+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:23:59.987902+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:00.988043+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:01.988243+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:02.988404+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:03.988543+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:04.988774+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:05.988985+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:06.989878+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:07.992255+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:08.992469+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:09.995017+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:10.995680+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:11.995914+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125026304 unmapped: 38944768 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:12.996092+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125026304 unmapped: 38944768 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:13.996389+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125026304 unmapped: 38944768 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:14.996649+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125026304 unmapped: 38944768 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:15.996916+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125026304 unmapped: 38944768 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:16.997259+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:17.997587+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:18.997784+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:19.998107+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:20.998336+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:21.998515+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:22.998958+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:23.999131+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:24.999494+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:25.999811+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:27.000121+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:28.000373+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:29.000540+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:30.000735+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:31.000937+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:32.001046+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:33.001231+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:34.001350+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:35.001542+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:36.001795+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:37.002024+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:38.002183+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:39.002329+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:40.002470+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:41.002605+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:42.002757+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:43.002883+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:44.003020+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:45.003127+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:46.003257+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:47.003446+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:48.003607+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125050880 unmapped: 38920192 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:49.003776+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125050880 unmapped: 38920192 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:50.003867+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125050880 unmapped: 38920192 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:51.003992+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125050880 unmapped: 38920192 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets getting new tickets!
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:52.004317+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _finish_auth 0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:52.005595+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125059072 unmapped: 38912000 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:53.004426+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:54.004540+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125059072 unmapped: 38912000 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:55.004691+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125059072 unmapped: 38912000 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:56.004899+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125059072 unmapped: 38912000 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:57.005094+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125059072 unmapped: 38912000 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:58.005211+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125059072 unmapped: 38912000 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:24:59.005331+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125059072 unmapped: 38912000 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:00.005469+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125059072 unmapped: 38912000 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:01.005633+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125067264 unmapped: 38903808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:02.005823+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125067264 unmapped: 38903808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:03.005957+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125067264 unmapped: 38903808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:04.006146+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125067264 unmapped: 38903808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:05.006310+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125067264 unmapped: 38903808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:06.006458+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125067264 unmapped: 38903808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:07.006691+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125067264 unmapped: 38903808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:08.006928+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125067264 unmapped: 38903808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:09.007067+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:10.007204+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:11.007368+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:12.007560+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:13.007751+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:14.007877+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:15.328651+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:16.328874+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:17.329089+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:18.329324+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:19.329533+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 38887424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:20.329912+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 38887424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:21.330076+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 38887424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:22.330220+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 38887424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:23.330359+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 38887424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:24.330488+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 38887424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:25.330669+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 38887424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:26.330857+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 38887424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:27.331036+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 38887424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:28.331174+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125091840 unmapped: 38879232 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:29.331364+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125091840 unmapped: 38879232 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:30.331547+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125091840 unmapped: 38879232 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:31.331662+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125091840 unmapped: 38879232 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:32.331799+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125100032 unmapped: 38871040 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:33.331933+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125100032 unmapped: 38871040 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:34.332031+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125100032 unmapped: 38871040 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:35.332162+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125100032 unmapped: 38871040 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:36.332361+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125100032 unmapped: 38871040 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:37.332614+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125100032 unmapped: 38871040 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:38.332831+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125100032 unmapped: 38871040 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:39.332975+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:40.333122+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:41.333337+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:42.333528+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:43.333656+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:44.333877+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:45.334057+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:46.334318+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:47.334509+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:48.334682+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:49.334891+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:50.335041+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:51.335219+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:52.335394+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125124608 unmapped: 38846464 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:53.335525+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125124608 unmapped: 38846464 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:54.335828+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125124608 unmapped: 38846464 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:55.336039+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125124608 unmapped: 38846464 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:56.336195+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125124608 unmapped: 38846464 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:57.336424+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125124608 unmapped: 38846464 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:58.336600+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125124608 unmapped: 38846464 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:25:59.336807+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:00.336985+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:01.337136+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:02.337288+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:03.337458+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:04.337626+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:05.337890+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:06.338087+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:07.338421+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:08.338646+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:09.338836+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:10.339006+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:11.339145+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125140992 unmapped: 38830080 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:12.339313+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125140992 unmapped: 38830080 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:13.339475+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125140992 unmapped: 38830080 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:14.339626+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125140992 unmapped: 38830080 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:15.340066+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125140992 unmapped: 38830080 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:16.340233+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125149184 unmapped: 38821888 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:17.340628+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:18.340815+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125149184 unmapped: 38821888 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:19.341172+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125149184 unmapped: 38821888 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:20.341612+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:21.341807+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:22.342081+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:23.342341+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:24.342560+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:25.342742+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:26.342920+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:27.343164+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:28.343288+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:29.343522+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:30.343906+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:31.344183+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:32.344508+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:33.344978+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125165568 unmapped: 38805504 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:34.345277+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125165568 unmapped: 38805504 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:35.345447+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125165568 unmapped: 38805504 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:36.345694+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125165568 unmapped: 38805504 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:37.345944+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125165568 unmapped: 38805504 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:38.346092+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125165568 unmapped: 38805504 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:39.346317+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125165568 unmapped: 38805504 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:40.346485+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125165568 unmapped: 38805504 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:41.346636+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125165568 unmapped: 38805504 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:42.346832+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:43.346975+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:44.347183+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:45.347383+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:46.347670+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:47.348004+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:48.348262+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:49.348464+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:50.348607+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:51.348753+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:52.348989+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:53.349185+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:54.349404+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:55.349558+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:56.349776+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:57.349968+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:58.350138+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:26:59.350337+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:00.350483+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 38780928 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:01.350635+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 38780928 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:02.350771+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 38780928 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:03.350924+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 38780928 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:04.351075+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 38780928 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:05.351204+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 38780928 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:06.351351+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 38780928 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:07.351526+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 38780928 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:08.351688+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 38780928 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:09.351840+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 38780928 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:10.351995+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 38772736 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:11.352172+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 38772736 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:12.352336+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 38772736 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:13.352490+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 38772736 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:14.352675+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 38772736 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:15.352793+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 38772736 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:16.352933+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:17.353169+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:18.353325+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:19.353478+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:20.354180+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:21.354899+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:22.355531+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:23.355941+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:24.356171+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:25.356572+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:26.356735+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172166800 session 0x55d1720365a0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172523000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:27.357129+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:28.357266+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:29.357568+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:30.357806+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:31.357986+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:32.358519+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:33.358776+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:34.359127+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:35.359393+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:36.359579+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:37.359799+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:38.359988+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:39.360146+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:40.360270+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 38739968 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:41.360448+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 38739968 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:42.360644+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 38739968 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:43.360846+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 38739968 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:44.361217+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 38739968 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:45.361413+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 38739968 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:46.361691+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125239296 unmapped: 38731776 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:47.361947+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125239296 unmapped: 38731776 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:48.362230+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125239296 unmapped: 38731776 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:49.362441+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125239296 unmapped: 38731776 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:50.362574+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:51.362780+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:52.363031+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:53.363207+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:54.363377+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:55.363571+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:56.363774+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:57.363968+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:58.364231+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:27:59.364383+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:00.364516+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:01.364644+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 38715392 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:02.364825+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 38715392 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:03.365038+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 38715392 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:04.365241+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125263872 unmapped: 38707200 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:05.365443+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125263872 unmapped: 38707200 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:06.365731+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125263872 unmapped: 38707200 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:07.365963+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125263872 unmapped: 38707200 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:08.366131+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125263872 unmapped: 38707200 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:09.366316+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125263872 unmapped: 38707200 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:10.366393+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125263872 unmapped: 38707200 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:11.366550+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125263872 unmapped: 38707200 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:12.366896+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 38699008 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:13.367042+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 38699008 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:14.367238+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 38699008 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:15.367413+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 38699008 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:16.367556+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 38699008 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:17.367763+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 38699008 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:18.367874+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 38699008 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:19.368100+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 38699008 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:20.368216+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125280256 unmapped: 38690816 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:21.368363+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125280256 unmapped: 38690816 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:22.368501+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125280256 unmapped: 38690816 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:23.368830+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125288448 unmapped: 38682624 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:24.369674+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125288448 unmapped: 38682624 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:25.370604+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125288448 unmapped: 38682624 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:26.371672+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125288448 unmapped: 38682624 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:27.371914+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125288448 unmapped: 38682624 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:28.373441+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125296640 unmapped: 38674432 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:29.374129+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125296640 unmapped: 38674432 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:30.374362+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125296640 unmapped: 38674432 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:31.374479+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125304832 unmapped: 38666240 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:32.375227+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125304832 unmapped: 38666240 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:33.375563+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125304832 unmapped: 38666240 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:34.376098+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125304832 unmapped: 38666240 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:35.376293+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125304832 unmapped: 38666240 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:36.376512+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125304832 unmapped: 38666240 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:37.376762+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125304832 unmapped: 38666240 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:38.376954+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125304832 unmapped: 38666240 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:39.377381+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125304832 unmapped: 38666240 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:40.377738+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125313024 unmapped: 38658048 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:41.378127+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125313024 unmapped: 38658048 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:42.378455+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125313024 unmapped: 38658048 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:43.378826+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125313024 unmapped: 38658048 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:44.379038+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125313024 unmapped: 38658048 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:45.379216+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125313024 unmapped: 38658048 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:46.379438+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125313024 unmapped: 38658048 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:47.379967+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125313024 unmapped: 38658048 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:48.380159+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125321216 unmapped: 38649856 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:49.380304+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125321216 unmapped: 38649856 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:50.380516+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17116ac00 session 0x55d1731450e0
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172528c00
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172507800 session 0x55d172ee7c20
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: handle_auth_request added challenge on 0x55d172168000
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125329408 unmapped: 38641664 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:51.380745+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125329408 unmapped: 38641664 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:52.381001+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:53.381190+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:54.381405+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:55.381579+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:56.381743+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:57.381882+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:58.382040+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:28:59.382156+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:00.382294+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:01.382393+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:02.382527+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:03.382675+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:04.382792+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:05.382910+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:06.382986+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:07.383175+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:08.383327+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:09.383497+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:10.383662+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:11.383804+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:12.383934+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:13.384049+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:14.384199+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:15.384330+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125362176 unmapped: 38608896 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:16.384449+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125362176 unmapped: 38608896 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:17.384597+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125370368 unmapped: 38600704 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:18.384733+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 10:29:53 compute-1 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 10:29:53 compute-1 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125370368 unmapped: 38600704 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:19.384861+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125370368 unmapped: 38600704 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:20.384993+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'config diff' '{prefix=config diff}'
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'config show' '{prefix=config show}'
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'counter dump' '{prefix=counter dump}'
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'counter schema' '{prefix=counter schema}'
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125444096 unmapped: 38526976 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:21.385114+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 38166528 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: tick
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_tickets
Jan 26 10:29:53 compute-1 ceph-osd[77632]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-26T10:29:22.385262+0000)
Jan 26 10:29:53 compute-1 ceph-osd[77632]: do_command 'log dump' '{prefix=log dump}'
Jan 26 10:29:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:29:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:53.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:29:53 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:53 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:29:53 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:53.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:29:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:29:53.957 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:29:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:29:53.957 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:29:53 compute-1 ovn_metadata_agent[143321]: 2026-01-26 10:29:53.957 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:29:54 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 26 10:29:54 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3975494299' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 10:29:54 compute-1 ceph-mon[80107]: from='client.27941 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:54 compute-1 ceph-mon[80107]: from='client.18162 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:54 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3425527103' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 10:29:54 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2923609229' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 10:29:54 compute-1 ceph-mon[80107]: from='client.27962 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:54 compute-1 ceph-mon[80107]: from='client.18186 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:54 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3538019178' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 26 10:29:54 compute-1 ceph-mon[80107]: from='client.27604 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:54 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2468740989' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:29:54 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2445155982' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 26 10:29:54 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3579023726' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 26 10:29:54 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/624911500' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 10:29:54 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3977685281' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 10:29:54 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3975494299' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 10:29:54 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 26 10:29:54 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2206345920' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 10:29:54 compute-1 nova_compute[226322]: 2026-01-26 10:29:54.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:29:54 compute-1 crontab[252258]: (root) LIST (root)
Jan 26 10:29:54 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Jan 26 10:29:54 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2249248986' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 26 10:29:55 compute-1 ceph-mon[80107]: from='client.27983 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:55 compute-1 ceph-mon[80107]: from='client.18207 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:55 compute-1 ceph-mon[80107]: from='client.27625 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:55 compute-1 ceph-mon[80107]: from='client.18219 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:55 compute-1 ceph-mon[80107]: from='client.28004 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:55 compute-1 ceph-mon[80107]: pgmap v1439: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:29:55 compute-1 ceph-mon[80107]: from='client.27646 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:55 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/971384341' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 10:29:55 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2035482829' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 10:29:55 compute-1 ceph-mon[80107]: from='client.28025 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:55 compute-1 ceph-mon[80107]: from='client.18243 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:55 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/4234181687' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:29:55 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2206345920' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 10:29:55 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1628073812' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 26 10:29:55 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3756206751' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 26 10:29:55 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2249248986' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #82. Immutable memtables: 0.
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.363671) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 49] Flushing memtable with next log file: 82
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423395363729, "job": 49, "event": "flush_started", "num_memtables": 1, "num_entries": 1344, "num_deletes": 250, "total_data_size": 3092090, "memory_usage": 3120232, "flush_reason": "Manual Compaction"}
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 49] Level-0 flush table #83: started
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423395371048, "cf_name": "default", "job": 49, "event": "table_file_creation", "file_number": 83, "file_size": 1272752, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 42093, "largest_seqno": 43432, "table_properties": {"data_size": 1267847, "index_size": 2173, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 14079, "raw_average_key_size": 21, "raw_value_size": 1256855, "raw_average_value_size": 1948, "num_data_blocks": 95, "num_entries": 645, "num_filter_entries": 645, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769423294, "oldest_key_time": 1769423294, "file_creation_time": 1769423395, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 83, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 49] Flush lasted 7407 microseconds, and 3313 cpu microseconds.
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.371083) [db/flush_job.cc:967] [default] [JOB 49] Level-0 flush table #83: 1272752 bytes OK
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.371101) [db/memtable_list.cc:519] [default] Level-0 commit table #83 started
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.372676) [db/memtable_list.cc:722] [default] Level-0 commit table #83: memtable #1 done
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.372690) EVENT_LOG_v1 {"time_micros": 1769423395372687, "job": 49, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.372730) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 49] Try to delete WAL files size 3085468, prev total WAL file size 3085468, number of live WAL files 2.
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000079.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.373442) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031323530' seq:72057594037927935, type:22 .. '6D6772737461740031353031' seq:0, type:0; will stop at (end)
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 50] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 49 Base level 0, inputs: [83(1242KB)], [81(14MB)]
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423395373489, "job": 50, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [83], "files_L6": [81], "score": -1, "input_data_size": 16081373, "oldest_snapshot_seqno": -1}
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 50] Generated table #84: 7102 keys, 12714908 bytes, temperature: kUnknown
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423395430867, "cf_name": "default", "job": 50, "event": "table_file_creation", "file_number": 84, "file_size": 12714908, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12671863, "index_size": 24181, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17797, "raw_key_size": 187043, "raw_average_key_size": 26, "raw_value_size": 12548292, "raw_average_value_size": 1766, "num_data_blocks": 942, "num_entries": 7102, "num_filter_entries": 7102, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769423395, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 84, "seqno_to_time_mapping": "N/A"}}
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.431113) [db/compaction/compaction_job.cc:1663] [default] [JOB 50] Compacted 1@0 + 1@6 files to L6 => 12714908 bytes
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.450818) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 279.8 rd, 221.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 14.1 +0.0 blob) out(12.1 +0.0 blob), read-write-amplify(22.6) write-amplify(10.0) OK, records in: 7570, records dropped: 468 output_compression: NoCompression
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.450853) EVENT_LOG_v1 {"time_micros": 1769423395450840, "job": 50, "event": "compaction_finished", "compaction_time_micros": 57473, "compaction_time_cpu_micros": 23764, "output_level": 6, "num_output_files": 1, "total_output_size": 12714908, "num_input_records": 7570, "num_output_records": 7102, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000083.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423395451238, "job": 50, "event": "table_file_deletion", "file_number": 83}
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000081.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423395453451, "job": 50, "event": "table_file_deletion", "file_number": 81}
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.373374) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.453498) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.453503) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.453505) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.453507) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:29:55 compute-1 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.453509) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 10:29:55 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 26 10:29:55 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3093970821' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 10:29:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:55.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:55 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:55 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:55 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:55.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:55 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Jan 26 10:29:55 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/288116703' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 26 10:29:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Jan 26 10:29:56 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4002871036' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 26 10:29:56 compute-1 ceph-mon[80107]: from='client.27673 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:56 compute-1 ceph-mon[80107]: from='client.28052 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:56 compute-1 ceph-mon[80107]: from='client.18273 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:56 compute-1 ceph-mon[80107]: from='client.27691 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:56 compute-1 ceph-mon[80107]: from='client.28073 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:29:56 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/4240491322' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 10:29:56 compute-1 ceph-mon[80107]: from='client.18282 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:29:56 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3093970821' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 10:29:56 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/374892030' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 26 10:29:56 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/288116703' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 26 10:29:56 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/4002871036' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 26 10:29:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Jan 26 10:29:56 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4191693100' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 26 10:29:56 compute-1 nova_compute[226322]: 2026-01-26 10:29:56.436 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:29:56 compute-1 nova_compute[226322]: 2026-01-26 10:29:56.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 10:29:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Jan 26 10:29:56 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1771140372' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 26 10:29:56 compute-1 nova_compute[226322]: 2026-01-26 10:29:56.708 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:29:56 compute-1 nova_compute[226322]: 2026-01-26 10:29:56.709 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:29:56 compute-1 nova_compute[226322]: 2026-01-26 10:29:56.709 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:29:56 compute-1 nova_compute[226322]: 2026-01-26 10:29:56.709 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 10:29:56 compute-1 nova_compute[226322]: 2026-01-26 10:29:56.709 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:29:56 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Jan 26 10:29:56 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2958203862' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 26 10:29:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Jan 26 10:29:57 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2765354709' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 26 10:29:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:29:57 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1027645102' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:29:57 compute-1 nova_compute[226322]: 2026-01-26 10:29:57.172 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:29:57 compute-1 nova_compute[226322]: 2026-01-26 10:29:57.321 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 10:29:57 compute-1 nova_compute[226322]: 2026-01-26 10:29:57.322 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4517MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 10:29:57 compute-1 nova_compute[226322]: 2026-01-26 10:29:57.323 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 10:29:57 compute-1 nova_compute[226322]: 2026-01-26 10:29:57.323 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 10:29:57 compute-1 ceph-mon[80107]: from='client.27718 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:57 compute-1 ceph-mon[80107]: from='client.28097 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:29:57 compute-1 ceph-mon[80107]: from='client.18306 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:29:57 compute-1 ceph-mon[80107]: from='client.28115 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:29:57 compute-1 ceph-mon[80107]: from='client.18336 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:29:57 compute-1 ceph-mon[80107]: from='client.27739 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:29:57 compute-1 ceph-mon[80107]: pgmap v1440: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:29:57 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/574585194' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 26 10:29:57 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/4191693100' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 26 10:29:57 compute-1 ceph-mon[80107]: from='client.28136 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:29:57 compute-1 ceph-mon[80107]: from='client.18351 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:29:57 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/862678016' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 26 10:29:57 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1771140372' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 26 10:29:57 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2355982261' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 26 10:29:57 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2958203862' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 26 10:29:57 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2428474459' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 26 10:29:57 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1617965274' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 26 10:29:57 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2765354709' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 26 10:29:57 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1027645102' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:29:57 compute-1 nova_compute[226322]: 2026-01-26 10:29:57.387 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 10:29:57 compute-1 nova_compute[226322]: 2026-01-26 10:29:57.388 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 10:29:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Jan 26 10:29:57 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3989524323' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 26 10:29:57 compute-1 nova_compute[226322]: 2026-01-26 10:29:57.417 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 10:29:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Jan 26 10:29:57 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1625537296' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 26 10:29:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:57.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:57 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:57 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:29:57 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:57.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:29:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:29:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 10:29:57 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3510057928' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:29:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Jan 26 10:29:57 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2644301426' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 26 10:29:57 compute-1 nova_compute[226322]: 2026-01-26 10:29:57.891 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 10:29:57 compute-1 nova_compute[226322]: 2026-01-26 10:29:57.897 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 10:29:57 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Jan 26 10:29:57 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1763956724' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 26 10:29:57 compute-1 nova_compute[226322]: 2026-01-26 10:29:57.926 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 10:29:57 compute-1 nova_compute[226322]: 2026-01-26 10:29:57.928 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 10:29:57 compute-1 nova_compute[226322]: 2026-01-26 10:29:57.928 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.605s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 10:29:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:29:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:29:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:29:58 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:29:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 10:29:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/844049035' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 10:29:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/844049035' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:29:58 compute-1 systemd[1]: Starting Hostname Service...
Jan 26 10:29:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Jan 26 10:29:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/181875530' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 26 10:29:58 compute-1 systemd[1]: Started Hostname Service.
Jan 26 10:29:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Jan 26 10:29:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2958399404' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.27766 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.27781 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.28193 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3789585773' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3989524323' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3927723701' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1625537296' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/24183373' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1231297347' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1743187370' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3510057928' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2644301426' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1763956724' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/4256254640' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/844049035' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.10:0/844049035' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3318151391' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/131168614' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/181875530' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2135477831' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2958399404' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Jan 26 10:29:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3155950399' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 26 10:29:58 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Jan 26 10:29:58 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4061816220' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 26 10:29:59 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Jan 26 10:29:59 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3651764905' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 26 10:29:59 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Jan 26 10:29:59 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2654257985' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 26 10:29:59 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Jan 26 10:29:59 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/44222642' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 26 10:29:59 compute-1 ceph-mon[80107]: from='client.27808 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:29:59 compute-1 ceph-mon[80107]: pgmap v1441: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:29:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2484893466' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 26 10:29:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3600782431' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 26 10:29:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/624103412' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 26 10:29:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3155950399' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 26 10:29:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/4061816220' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 26 10:29:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2476322614' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 26 10:29:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2719211327' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 26 10:29:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3376148640' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 26 10:29:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3651764905' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 26 10:29:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/815486066' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 26 10:29:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2654257985' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 26 10:29:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/869627358' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 26 10:29:59 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/44222642' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 26 10:29:59 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Jan 26 10:29:59 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2665981540' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 26 10:29:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:59.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:29:59 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:29:59 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:29:59 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:59.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:30:00 compute-1 ceph-mon[80107]: from='client.18504 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:30:00 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/69190768' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 26 10:30:00 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2665981540' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 26 10:30:00 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/4233254299' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 26 10:30:00 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1343797294' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 26 10:30:00 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/4102320703' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 26 10:30:00 compute-1 ceph-mon[80107]: overall HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore; 1 failed cephadm daemon(s)
Jan 26 10:30:00 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/1960142830' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 26 10:30:00 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3121584314' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 26 10:30:00 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Jan 26 10:30:00 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/367892269' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 26 10:30:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Jan 26 10:30:01 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1413975470' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 26 10:30:01 compute-1 nova_compute[226322]: 2026-01-26 10:30:01.438 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:30:01 compute-1 nova_compute[226322]: 2026-01-26 10:30:01.441 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:30:01 compute-1 ceph-mon[80107]: from='client.28322 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:30:01 compute-1 ceph-mon[80107]: from='client.18516 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:30:01 compute-1 ceph-mon[80107]: from='client.28340 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:30:01 compute-1 ceph-mon[80107]: from='client.28346 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:01 compute-1 ceph-mon[80107]: from='client.18522 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:01 compute-1 ceph-mon[80107]: pgmap v1442: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:30:01 compute-1 ceph-mon[80107]: from='client.28355 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:01 compute-1 ceph-mon[80107]: from='client.18546 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:01 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3687375884' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 26 10:30:01 compute-1 ceph-mon[80107]: from='client.28379 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:01 compute-1 ceph-mon[80107]: from='client.18564 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:01 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1939256608' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 26 10:30:01 compute-1 ceph-mon[80107]: from='client.27943 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:30:01 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/367892269' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 26 10:30:01 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/4029590274' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 26 10:30:01 compute-1 ceph-mon[80107]: from='client.28400 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:01 compute-1 ceph-mon[80107]: from='client.18576 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:01 compute-1 ceph-mon[80107]: from='client.27967 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:30:01 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2020780984' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 26 10:30:01 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1413975470' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 26 10:30:01 compute-1 ceph-mon[80107]: from='client.27973 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:30:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:30:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:30:01.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:30:01 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:30:01 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:30:01 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:30:01.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:30:01 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Jan 26 10:30:01 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2067073806' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Jan 26 10:30:02 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1558495807' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='client.18591 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='client.18597 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/4096719302' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='client.27991 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2067073806' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='client.28436 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='client.18612 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1558495807' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1857393829' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='client.28012 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: pgmap v1443: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3882675605' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='client.28460 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='client.18630 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:30:02 compute-1 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 10:30:02 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:30:02 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:30:02 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:30:03 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:30:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:30:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 10:30:03 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:30:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:30:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 10:30:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:30:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 10:30:03 compute-1 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:30:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 10:30:03 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:30:03 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:30:03 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Jan 26 10:30:03 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1342520033' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='client.28021 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/824904298' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='client.28057 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/4207651308' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='client.28087 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1331959644' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/1342520033' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 26 10:30:03 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/3931590477' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 26 10:30:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:30:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:30:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:30:03.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:30:03 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:30:03 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 10:30:03 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:30:03.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 10:30:04 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Jan 26 10:30:04 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/186538420' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 26 10:30:04 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:30:04 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:30:04 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:30:04 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:30:04 compute-1 podman[253721]: 2026-01-26 10:30:04.525201683 +0000 UTC m=+0.104673359 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 10:30:04 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Jan 26 10:30:04 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2070446568' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 26 10:30:04 compute-1 ceph-mon[80107]: from='client.28547 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:30:04 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:30:04 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:30:04 compute-1 ceph-mon[80107]: from='client.18711 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:30:04 compute-1 ceph-mon[80107]: from='client.28108 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 26 10:30:04 compute-1 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 26 10:30:04 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:30:04 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:30:04 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:30:04 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:30:04 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/186538420' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 26 10:30:04 compute-1 ceph-mon[80107]: pgmap v1444: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Jan 26 10:30:04 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:30:04 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:30:04 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3993199471' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 26 10:30:04 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 10:30:04 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 10:30:04 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 10:30:04 compute-1 ceph-mon[80107]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 10:30:05 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Jan 26 10:30:05 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3069245116' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 26 10:30:05 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Jan 26 10:30:05 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3783749478' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 26 10:30:05 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/2070446568' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 26 10:30:05 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/3430923927' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 26 10:30:05 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/4021159621' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 26 10:30:05 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3069245116' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 26 10:30:05 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/2923029982' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 26 10:30:05 compute-1 ceph-mon[80107]: from='client.28604 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:30:05 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3783749478' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 26 10:30:05 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1239806339' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 26 10:30:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:30:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:30:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:30:05.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:30:05 compute-1 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 10:30:05 compute-1 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 10:30:05 compute-1 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:30:05.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 10:30:06 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Jan 26 10:30:06 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3130148119' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 26 10:30:06 compute-1 nova_compute[226322]: 2026-01-26 10:30:06.443 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:30:06 compute-1 nova_compute[226322]: 2026-01-26 10:30:06.444 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 10:30:06 compute-1 nova_compute[226322]: 2026-01-26 10:30:06.444 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 10:30:06 compute-1 nova_compute[226322]: 2026-01-26 10:30:06.445 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:30:06 compute-1 nova_compute[226322]: 2026-01-26 10:30:06.485 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 10:30:06 compute-1 nova_compute[226322]: 2026-01-26 10:30:06.486 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 10:30:06 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/249839602' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 26 10:30:06 compute-1 ceph-mon[80107]: from='client.28619 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:30:06 compute-1 ceph-mon[80107]: from='client.18774 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Jan 26 10:30:06 compute-1 ceph-mon[80107]: pgmap v1445: 353 pgs: 353 active+clean; 41 MiB data, 256 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Jan 26 10:30:06 compute-1 ceph-mon[80107]: from='client.? 192.168.122.102:0/2754641701' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 26 10:30:06 compute-1 ceph-mon[80107]: from='client.? 192.168.122.101:0/3130148119' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 26 10:30:06 compute-1 ceph-mon[80107]: from='client.? 192.168.122.100:0/1015018809' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 26 10:30:06 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0)
Jan 26 10:30:06 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1329857520' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 26 10:30:07 compute-1 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Jan 26 10:30:07 compute-1 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1209724897' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
